DIM: how to load metadata to HFM

Hi,
We want to load metadata into the HFM application through the HFM adapter in Informatica PowerCenter (v8.1.1).
Data can be loaded to HFM, but the adapter cannot load metadata to build HFM dimensions dynamically.
Is there any method to load metadata to HFM from PowerCenter?
Thank you in advance.

I recommend you contact Informatica, since it is their product that is preventing the load, not HFM itself.

Similar Messages

  • DIM: how to load metadata to Essbase without using rule files

    Hi,
    The Essbase adapter has been installed in Informatica PowerCenter (v8.1.1). We want to create an Essbase target definition to load metadata. In the Table Creation Wizard we select Table Type: Dynamic dimension building (Type 3), but it requires a rules file to be specified in the Column Creation Wizard.
    Is there any method to load metadata into Essbase without using a rules file?
    Thank you in advance.

    You can load data into Essbase without a rules file by means of free-form loading, where the data source is a file.
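    As a rough sketch only (the member names below come from the standard Sample Basic demo outline, not your application), a free-form data record simply names one member from every dimension and then gives the value, for example:
    Jan Sales Actual "New York" Cola 678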

  • Loading Metadata ODI HFM

    I'm loading metadata and the process can't load.
    The error in the log is this one:
    Metadata Load started at 11:47:03
    PASS I - Analyzing Section Headers
    PASS II - Validating member names and hierarchies
    Entity
    Account
    Custom1
    Custom2
    Custom3
    Custom4
    PASS III - Validating all sections
    Accounts...
    Accounts OK
    Metadata referential integrity check started at 11:47:03
    PASS I - Analyzing Section Headers
    PASS II - Validating member names and hierarchies
    Entity
    Account
    Custom1
    Custom2
    Custom3
    Custom4
    PASS III - Validating all sections
    Accounts...
    Accounts OK
    Pass IV - updating database
    Removing metadata items no longer used...
    Account
    13 parent/child pair(s) removed from hierarchy
    12 member(s) removed
    Done removing metadata items
    Renaming members...
    Account
    Accounts...
    Error: Invalid tree insertion point
    Error: Invalid tree insertion point
    any idea???

    Check out John's Blog => http://john-goodwin.blogspot.com/2010/02/odi-series-loading-hfm-metadata.html
    Thank you,
    Todd Rebner

  • Loading Metadata to HFM Classic from FDM

    Hi,
    Can anyone please explain whether it is possible to load metadata (members) into an HFM Classic application using FDM?
    If so, please explain in detail.
    Thanks in advance

    FDM is designed to load data, not metadata. It can be done, but it would involve a fair bit of custom code (including a wrapper DLL), as the metadata load functionality is not exposed by the HFM web SDK. I'd look at alternative methods for doing this and stick to loading data via FDM.

  • How to load metadata directly into essbase 9.3 ?

    Hi all,
    I am loading metadata directly into Essbase with IKM SQL to Hyperion Essbase (METADATA).
    My metadata is already in Microsoft SQL Server 2000. After executing an interface that loads metadata to Essbase and checking the Essbase outline, it seems the outline was not built correctly, and there are still many rejected records.
    My question is:
    Can I use the same rules file that was used to load metadata from a text file?
    In this case I am using the same rules file that loads metadata from a text file.
    Let me know if you need more information on this.
    Thank you in advance.

    Hi,
    You can use the same rules file; just make sure the rules separator in the IKM options matches the one in your rules file.
    You can also turn on error logging and logging in the IKM options, so you should get more information about why the metadata load is failing.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Loading metadata into HFM application through EPMA - Beginner

    Hello,
    I have an HFM application which was created in classic format and later converted to EPMA. I would now like to add more metadata to it, either manually or any other way. I tried to create a few members manually through the Dimension Library, but when I look in the actual application the members are not there. Is there anything else to do, like deploying it? It would be a great help if anybody could tell me how to add members manually and in bulk.
    Thanks

    Yes. You'll need to deploy the application to see the changes outside of EPMA. Since EPMA is your metadata management tool now, you'll make the changes there, and then deploy (or load) it to the application. There are several ways to make metadata changes in EPMA:
    1) Directly in the Dimension Library as you have done
    2) Using the Grid Editor functionality in the Dimension Library (right-click any dimension label and you'll see the option)
    3) Using the EPMA File Generator (this is best for building new apps and regularly occurring changes)
    4) Using LCM (this is my preference for bulk changes)
    Shoot an email to the address in my profile and I'd be happy to get you some further info.

  • How to load data in HFM from oracle database without FDM

    hi all,
    I want to know whether we can load data directly from an Oracle database residing on a different server, without writing an integration script in FDM.
    points to note:
    we want to load data without using
    1.FDM
    2.Flat file
    3.Excel file or csv or dat file

    The process is described in the EPMA administrator's guide, which you can find in PDF format included in the HFM product documentation. The file name is bpma_admin.pdf. If you have installed HFM (and included EPMA during the installation process), you may find this file in a folder that looks like C:\Hyperion\FinancialManagement\Documentation\en.
    If you have not installed the products yet, bear in mind that all Hyperion product documentation (including HFM) can be downloaded from Oracle's e-delivery site (http://edelivery.oracle.com). The Hyperion product documentation is a large zip archive containing smaller archives per product. The smaller archive you need to open, once you download the complete product documentation, is hfm_93100_product_doc.zip.

  • Loading metadata and creating users in FDM

    hi friends ,
    1) I am using HFM and FDM 9.3.1. Can we load HFM metadata using FDM? If yes, please tell me how.
    2) Please tell me the steps to configure Shared Services with FDM. Can we configure only MSAD and LDAP with FDM?
    Is there any role for VB authentication scripts for Shared Services users, or are they only for users created in User Management in FDM?

    Hello,
    Currently FDM cannot load metadata to HFM. You would need something like MDM/DRM, as that is their job.
    FDM 9.3.1.x only integrates with MSAD/LDAP/NTLM providers. Since Shared Services leverages OpenLDAP, it may be possible; I would recommend creating a Support SR to obtain the process.
    Fusion Edition (11.1.1.x) integrates directly with Shared Services.
    Thank you.

  • Error Loading Metadata

    Hi all,
    I'm using HFM 11.1.1. While loading metadata in the HFM client, I'm facing the following error message:
    "This is a BPMA application. This functionality done through BPMA"
    Can anyone give a suggestion to resolve this issue?

    You will not see the phrase "classic" application anywhere in the user interface, though you will see it in the product documentation. "Classic" simply describes an application created using the pre-version 9.3.1 mode where you created an application profile and metadata file using the Win32 interface. This generates either an *.app or *.xml file which you can load directly into the application either through the web or Win32 interface.
    "EPMA" is a term introduced with the EPM Architect module in version 9.3.1 (and really improved with the 11.1.1 and 11.1.2 releases) which allows the application to be created in the EPMA web interface only, and also create and manage metadata there as well. The module does allow for an externaly created metadata file to be loaded (*.ads format) but only into the dimension library and not directly into the HFM application. Once the metadata exists in the dimension library, you "deploy" it into the HFM application, and so it is an indirect rather than direct load.
    Product functionality allows for a "classic" application to be created and then upgraded/converted to an EPMA application. You can do this if you want to start in "classic mode" or if you want to convert a previously existing application up to EPMA. Once you convert an application to EPMA mode, you cannot convert back to classic mode. You can have a mix of EPMA and classic applications in an environment. The EPMA applications will show up in the application library but the classic ones will not.
    You can learn more in the HFM Administrator's Guide, under "Managing Applications", and by also reading Oracle Hyperion Enterprise Performance Management Architect Administrator’s Guide.
    --Chris

  • Load metadata for idoc

    hi,
    How do I load metadata for an IDoc?
    I assume IDX2 is for checking whether metadata is loaded or not.
    thanks
    guna

    Hi,
    See the following:
    IDoc (Intermediate Document) metadata comprises structures for the corresponding IDoc types that are required by the IDoc adapter to convert these IDocs to IDoc XML format and the other way around.
    Using an RFC connection, metadata of this type can be either called directly at runtime or loaded to the Integration Server beforehand.
    The system containing the metadata is either the sender or receiver SAP system or, if the sender or receiver system is a subsystem, the SAP reference system where the metadata is saved.
    You can display metadata that has already been loaded, or if you are upgrading the application system then you can delete the metadata and reload it.
    Prerequisites
    To access the metadata in the sender system, you must establish an RFC connection to this system using the port maintenance in the IDoc adapter.
    Procedure
    1. To find out what metadata has already been loaded, call the transaction Metadata Overview for IDoc Adapter (IDX2).
       The system displays a screen with the directory of all systems connected with the IDoc adapter (including a description) for which metadata has already been loaded. Choose Port Maintenance in IDoc Adapter to call the corresponding transaction and to create additional ports.
    2. Expand the individual systems to display the IDoc types and clients including a description for each system. The system displays the metadata for each connected system for which metadata has already been loaded.
    3. To apply the metadata structure loaded from a particular system to another system (for example, to an SAP reference system), select the link to the corresponding IDoc type and choose Copy.
       The system displays a dialog box in which you can copy the IDoc type description to another system (Target Port).
    4. To delete metadata that has already been loaded, select the link to the corresponding IDoc type and choose Delete.
       The system asks you if you really want to delete the selected structure.
    5. To delete the metadata structure, choose Yes.
    6. To load additional metadata, choose Create.
       The system displays a dialog box where you can enter the IDoc Type including Extension and the system (Source Port).
    7. Make the required entries and choose Continue.
       The new structure is inserted in the tree structure as follows:
       - If the structure originates from a system from which metadata has already been loaded, it is inserted below the structures already loaded from this system.
       - If the structure originates from a system from which no metadata has already been loaded, it is inserted together with the system below the already listed systems.
    8. To display details about a metadata structure that has already been loaded, choose the link to the corresponding IDoc type. To display this structure in detail, choose Display. The system displays the structure in a hierarchy tree.
    9. To display the corresponding system (port) and the basic type from the detailed display, choose Header Information.
    10. To display the segment versions of all segment types in the structure, choose All Segment Versions.
    11. Select a segment to display all its fields.
    Also refer to these articles:
    http://help.sap.com/saphelp_erp2004/helpdata/en/b9/c5b13bbeb0cb37e10000000a11402f/content.htm
    http://publib.boulder.ibm.com/infocenter/wtxdoc/v8r2m0/index.jsp?topic=/com.ibm.websphere.dtx.packsapxi.doc/tasks/t_pack_sapxi_Set_up_the_SAP_XI_System.htm
    Regards,
    Suryanarayana

  • Loading Metadata from ODI to Hyperion Planning Custom Dimension

    The customer wants to load metadata from ODI to a Hyperion Planning custom dimension using flat files (.txt files).
    Is it possible to load the metadata into a custom dimension? If so, do we need any other KM for Planning besides RKM Hyperion Planning?
    When I try to map the dimension from source to target, the connection is blank. Getting "Used by target columns none".
    Please refer to the image:
    http://1.bp.blogspot.com/_Z0lKn46L41I/TJuZcsQxIjI/AAAAAAAAA90/TTv79fbQ9ks/s1600/ODIIssue.JPG
    Thanks
    Vikram

    Yes, you can load to custom dimensions just like the other dimensions; the custom dimensions should be reversed into the model.
    You need to use the IKM SQL to Hyperion Planning, and make sure you set the staging area to be different from the target.
    If you want to see how to load metadata to planning, have a read of :- http://john-goodwin.blogspot.com/2008/10/odi-series-part-5-sql-to-planning.html
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How to load new planning application in EPM 11.1.1.2-DIM ESSBASE ADAPTER

    how to load new planning application in EPM 11.1.1.2-DIM ESSBASE ADAPTER

    If you are trying to load metadata into Planning using DIM, then have a look here: http://www.oracle.com/technology/obe/hyp_fp/DIM_Planning/OBE_Dim_Planning.html
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Problem loading metadata from ODI to HFM ( Error code: 0x80040154 [Class...

    Hi Experts
    I have an issue when I load metadata via ODI to HFM. I get this message:
    Error code: 0x80040154 [Class not registered
    com.hyperion.odi.common.ODIHAppException: Metadata load failed. Error code: 0x80040154 [Class not registered
    After some searching on the net, I see that this is due to a patch which has already been installed (ODI 10.1.3.5.5): HFMDriver.dll is renamed to HFMDriver32.dll, and HFMDriver64.dll is renamed to HFMDriver.dll.
    Patch 9377717: ORACLE DATA INTEGRATOR 10.1.3.6.0 PATCH
    1. I have reversed the HFM application TestApp into target for ODI and everything seems fine in the operator
    2. I have created a simple source flat file for the Account dimension
    3. I have created an interface to the Account dim. and verified the mapping with no errors
    4. The process stops when trying to load the metadata to HFM (step 5 of 7)
    5. When I search the log I see the Error code: 0x80040154 [Class not registered]
    Does anyone have any idea why the interface does not load the metadata?
    Brs
    Inge Andre

    These instructions, given to us by Support, fixed the problem described in the first topic.
    1. Unregister the adapter by opening a command prompt, changing to the path C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\, and running:
    RegSvcs.exe /u C:\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmAdapterCOMadmin\fdmAdapterCOMadmin.dll
    and:
    RegSvcs.exe /u C:\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll
    Please verify the correct path to the DLLs before proceeding.
    2. Re-register the adapter using the 64-bit version of RegSvcs, C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\RegSvcs.exe <PathToDLL>, i.e. ..\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll (see the worked example after these steps).
    N.B. Do not re-register the ComAdmin.dll because that is not a 64 bit component.
    4. Open the FDM workbench and configure the adapter by right clicking the adapter -> Configure and re-entering the username and password.
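    As a worked example, assuming the same default install path shown in the unregister step above (adjust it for your environment), the full 64-bit re-registration command would be:
    C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\RegSvcs.exe C:\Hyperion\products\FinancialDataQuality\SharedComponents\AdapterComponents\fdmFM11xG5C\fdmFM11xG5C.dll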
    Regards.

  • How to load BeginningBalance and movements into HFM via FDM from ERPI

    I have a file generated by ERPI with the columns of |AMOUNT|BEGIN_BALANCE_DR|BEGIN_BALANCE_CR|PERIOD_NET_DR|PERIOD_NET_CR|
    among many other columns.
    How can I load them via FDM into the Custom1 dimension in HFM, which contains BeginningBalance, Additions, Disposals, etc.?
    So basically I will have a trial balance, which I can compare with GL and OBI.

    Hello
    You can use the IDOC SOPGEN01 for that purpose.
    Please check transaction WE60 for IDOC documentation and there are many old threads with information about this IDOC.
    BR
    Caetano

  • How can i load Metadata file into georaster?

    Hello everybody,
    I store a TIFF file in my DB. I store the TIFF file like this:
    1.create GeoRaster Table:
    create table rm_image_t(
    georid number,
    file_type varchar2(30),
    image_file mdsys.sdo_georaster);
    2.create trigger:
    exec sdo_geor_utl.createdmltrigger('rm_image_t','image_file');
    3.Create raster data table:
    create table rdt1 of mdsys.sdo_raster(
    primary key(rasterid,pyramidlevel,bandblocknumber,rowblocknumber,columnblocknumber))
    lob(rasterblock) store as (nocache nologging);
    4.Grant:
    exec dbms_java.grant_permission('OPER','SYS:java.io.FilePermission','c:\aaa.tif','read');
    call dbms_java.grant_permission('MDSYS','SYS:java.io.FilePermission','c:\aaa.tif','read' );
    call dbms_java.grant_permission('OPER','SYS:java.io.FilePermission', 'c:\aaa.tif','read' );
    call dbms_java.grant_permission('PUBLIC','SYS:java.io.FilePermission','c:\aaa.tif','read' );
    call dbms_java.grant_permission('MDSYS','SYS:java.io.FilePermission','c:\aaa.tif','read' );
    call dbms_java.grant_permission('OPER','SYS:java.io.FilePermission', 'c:\aaa.tif','read');
    call dbms_java.grant_permission('PUBLIC','SYS:java.io.FilePermission','c:\aaa.tif','read');
    5.Insert tiff file:
    DECLARE
    geor SDO_GEORASTER;
    BEGIN
    -- Initialize an empty GeoRaster object into which the external image
    -- is to be imported.
    INSERT INTO rm_image_t
    values( 1, 'TIFF', sdo_geor.init('rdt1') );
    -- Import the TIFF image.
    SELECT image_file INTO geor FROM rm_image_t
    WHERE georid = 1 FOR UPDATE;
    sdo_geor.importFrom(geor,'blocksize=(512,512) compression=DEFLATE', 'TIFF', 'file','c:\aaa.tif');
    UPDATE rm_image_t SET image_file = geor WHERE georid = 1;
    COMMIT;
    END;
    Then:
    SQL> SELECT t.georid, sdo_geor.validategeoraster(t.image_file) isvalid
         from rm_image_t t order by georid;
    GEORID  ISVALID
         1  TRUE
    But I do not know how to load the TIFF file's metadata into GeoRaster.
    Can you help me? Thanks!

    My metadata file is called aaa.met; it has the following info:
    GROUP = METADATA_FILE
         PRODUCT_CREATION_TIME = 2004-02-12T15:09:20Z
         PRODUCT_FILE_SIZE = 703.1
         STATION_ID = "EDC"
         GROUND_STATION = "SGS"
         GROUP = ORTHO_PRODUCT_METADATA
              SPACECRAFT_ID = "Landsat7"
              SENSOR_ID = "ETM+"
              ACQUISITION_DATE = 2001-10-03
              WRS_PATH = 138
              WRS_ROW = 036
              SCENE_CENTER_LAT = +34.6152272
              SCENE_CENTER_LON = +91.7569675
              SCENE_UL_CORNER_LAT = +35.5648302
              SCENE_UL_CORNER_LON = +90.9958720
              SCENE_UR_CORNER_LAT = +35.2772270
              SCENE_UR_CORNER_LON = +92.9740331
              SCENE_LL_CORNER_LAT = +33.9414953
              SCENE_LL_CORNER_LON = +90.5596099
              SCENE_LR_CORNER_LAT = +33.6605964
              SCENE_LR_CORNER_LON = +92.5005228
              SCENE_UL_CORNER_MAPX = 318373.500
              SCENE_UL_CORNER_MAPY = 3937531.500
              SCENE_UR_CORNER_MAPX = 497638.500
              SCENE_UR_CORNER_MAPY = 3903787.500
              SCENE_LL_CORNER_MAPX = 274455.000
              SCENE_LL_CORNER_MAPY = 3758352.000
              SCENE_LR_CORNER_MAPX = 453691.500
              SCENE_LR_CORNER_MAPY = 3724636.500
              BAND1_FILE_NAME = "aaa.tif"
              GROUP = PROJECTION_PARAMETERS
                   REFERENCE_DATUM = "WGS84"
                   REFERENCE_ELLIPSOID = "WGS84"
                   GRID_CELL_ORIGIN = "Center"
                   UL_GRID_LINE_NUMBER = 1
                   UL_GRID_SAMPLE_NUMBER = 1
                   GRID_INCREMENT_UNIT = "Meters"
                   GRID_CELL_SIZE_PAN = 14.250
                   GRID_CELL_SIZE_THM = 57.000
                   GRID_CELL_SIZE_REF = 28.500
                   FALSE_NORTHING = 0
                   ORIENTATION = "NUP"
                   RESAMPLING_OPTION = "NN"
                   MAP_PROJECTION = "UTM"
              END_GROUP = PROJECTION_PARAMETERS
              GROUP = UTM_PARAMETERS
                   ZONE_NUMBER = +46
              END_GROUP = UTM_PARAMETERS
              SUN_AZIMUTH = 147.9348938
              SUN_ELEVATION = 46.4220192
              QA_PERCENT_MISSING_DATA = 0
              CLOUD_COVER = 0
              PRODUCT_SAMPLES_PAN = 17814
              PRODUCT_LINES_PAN = 15754
              PRODUCT_SAMPLES_REF = 8907
              PRODUCT_LINES_REF = 7877
              PRODUCT_SAMPLES_THM = 4454
              PRODUCT_LINES_THM = 3939
              OUTPUT_FORMAT = "GEOTIFF"
         END_GROUP = ORTHO_PRODUCT_METADATA
         GROUP = L1G_PRODUCT_METADATA
              BAND_COMBINATION = "123456678"
              CPF_FILE_NAME = "L7CPF20011001_20011231_04"
              GROUP = MIN_MAX_RADIANCE
                   LMAX_BAND1 = 191.600
                   LMIN_BAND1 = -6.200
                   LMAX_BAND2 = 196.500
                   LMIN_BAND2 = -6.400
                   LMAX_BAND3 = 152.900
                   LMIN_BAND3 = -5.000
                   LMAX_BAND4 = 241.100
                   LMIN_BAND4 = -5.100
                   LMAX_BAND5 = 31.060
                   LMIN_BAND5 = -1.000
                   LMAX_BAND61 = 17.040
                   LMIN_BAND61 = 0.000
                   LMAX_BAND62 = 12.650
                   LMIN_BAND62 = 3.200
                   LMAX_BAND7 = 10.800
                   LMIN_BAND7 = -0.350
                   LMAX_BAND8 = 243.100
                   LMIN_BAND8 = -4.700
              END_GROUP = MIN_MAX_RADIANCE
              GROUP = MIN_MAX_PIXEL_VALUE
                   QCALMAX_BAND1 = 255.0
                   QCALMIN_BAND1 = 1.0
                   QCALMAX_BAND2 = 255.0
                   QCALMIN_BAND2 = 1.0
                   QCALMAX_BAND3 = 255.0
                   QCALMIN_BAND3 = 1.0
                   QCALMAX_BAND4 = 255.0
                   QCALMIN_BAND4 = 1.0
                   QCALMAX_BAND5 = 255.0
                   QCALMIN_BAND5 = 1.0
                   QCALMAX_BAND61 = 255.0
                   QCALMIN_BAND61 = 1.0
                   QCALMAX_BAND62 = 255.0
                   QCALMIN_BAND62 = 1.0
                   QCALMAX_BAND7 = 255.0
                   QCALMIN_BAND7 = 1.0
                   QCALMAX_BAND8 = 255.0
                   QCALMIN_BAND8 = 1.0
              END_GROUP = MIN_MAX_PIXEL_VALUE
              GROUP = PRODUCT_PARAMETERS
                   CORRECTION_METHOD_GAIN_BAND1 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND2 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND3 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND4 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND5 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND61 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND62 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND7 = "CPF"
                   CORRECTION_METHOD_GAIN_BAND8 = "CPF"
                   CORRECTION_METHOD_BIAS = "IC"
                   BAND1_GAIN = "H"
                   BAND2_GAIN = "H"
                   BAND3_GAIN = "H"
                   BAND4_GAIN = "L"
                   BAND5_GAIN = "H"
                   BAND6_GAIN1 = "L"
                   BAND6_GAIN2 = "H"
                   BAND7_GAIN = "H"
                   BAND8_GAIN = "L"
                   BAND1_GAIN_CHANGE = "0"
                   BAND2_GAIN_CHANGE = "0"
                   BAND3_GAIN_CHANGE = "0"
                   BAND4_GAIN_CHANGE = "0"
                   BAND5_GAIN_CHANGE = "0"
                   BAND6_GAIN_CHANGE1 = "0"
                   BAND6_GAIN_CHANGE2 = "0"
                   BAND7_GAIN_CHANGE = "0"
                   BAND8_GAIN_CHANGE = "0"
                   BAND1_SL_GAIN_CHANGE = "0"
                   BAND2_SL_GAIN_CHANGE = "0"
                   BAND3_SL_GAIN_CHANGE = "0"
                   BAND4_SL_GAIN_CHANGE = "0"
                   BAND5_SL_GAIN_CHANGE = "0"
                   BAND6_SL_GAIN_CHANGE1 = "0"
                   BAND6_SL_GAIN_CHANGE2 = "0"
                   BAND7_SL_GAIN_CHANGE = "0"
                   BAND8_SL_GAIN_CHANGE = "0"
              END_GROUP = PRODUCT_PARAMETERS
              GROUP = CORRECTIONS_APPLIED
                   STRIPING_BAND1 = "NONE"
                   STRIPING_BAND2 = "NONE"
                   STRIPING_BAND3 = "NONE"
                   STRIPING_BAND4 = "NONE"
                   STRIPING_BAND5 = "NONE"
                   STRIPING_BAND61 = "NONE"
                   STRIPING_BAND62 = "NONE"
                   STRIPING_BAND7 = "NONE"
                   STRIPING_BAND8 = "NONE"
                   BANDING = "N"
                   COHERENT_NOISE = "N"
                   MEMORY_EFFECT = "N"
                   SCAN_CORRELATED_SHIFT = "N"
                   INOPERABLE_DETECTORS = "N"
                   DROPPED_LINES = N
              END_GROUP = CORRECTIONS_APPLIED
         END_GROUP = L1G_PRODUCT_METADATA
    END_GROUP = METADATA_FILE
    END
    I want to load it into the DB and query the TIFF file using coordinates.
    Can you please tell me how to do it?
