Conversion of .dgn files to .shp files for spatial data processing

Hi All,
Presently I am working with an Oracle Spatial database (10g Release 2), and I have the following task:
Task: load spatial data into Oracle Database 10g Release 2.
Task process description: the client has given us an input file with a .dgn extension, so we have to convert it to a .shp file and load it into the spatial database.
There is a utility on the OTN site for loading .shp file data into the spatial database; that part we are able to do.
Now I am looking for how to convert the .dgn file to a .shp file. Is there a tool for this, or are there commands to do it in a specific environment?
Any help would be appreciated.
Thanks,
[email protected]

Hi,
There are several ways to do this:
1. Get a general GIS format translator. I use FME, which can be obtained from Safe Software at safe.com; there is a free 30-day evaluation.
2. Get GIS software that has a data translation facility. Take your pick of the systems.
3. Write some code to read the DGN into Oracle. The DGN format specification can be found at http://www.bentley.com/en-US/Products/MicroStation/OpenDGN/. You will need to know the version of DGN.
4. Ask your client to supply the data in SHP format.
Ivan
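
For the loading side, it may also help to sketch what has to exist after the OTN utility has loaded the converted .shp data: the geometry column must be registered in USER_SDO_GEOM_METADATA and given a spatial index before spatial queries will work. A rough sketch, assuming a hypothetical table ROADS with geometry column GEOM holding WGS84 data (table name, column name, extents, tolerance, and SRID are all placeholders):

-- Register the geometry column (dimension bounds and tolerance are placeholders)
INSERT INTO USER_SDO_GEOM_METADATA (table_name, column_name, diminfo, srid)
VALUES (
  'ROADS', 'GEOM',
  SDO_DIM_ARRAY(
    SDO_DIM_ELEMENT('X', -180, 180, 0.005),
    SDO_DIM_ELEMENT('Y',  -90,  90, 0.005)),
  8307);
COMMIT;

-- Create the spatial index on the loaded data
CREATE INDEX roads_geom_sidx ON roads (geom)
  INDEXTYPE IS MDSYS.SPATIAL_INDEX;

If the OTN utility in question is shp2sdo, it generates most of this for you as part of its output script; the statements above are just the end state to check for.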

Similar Messages

  • Can't open MOV files in FCP: "Searching for movie data in file..."

    I know this has been posted before, but none of them seem to match up to what I'm all of a sudden dealing with. I have 4 MOV files that I'm converting to WMV. Movie 1 & 2 exported fine. Now, however, any time I try to open another MOV file, I get an error that it's "Searching for movie data in file..." and the file name is ALWAYS the same. The weird thing is, the file it's looking for is none of the 4 movies I'm working with. So how did the first two work and this pop out of nowhere??
    I have cleared out the render cache. Dumped all the cache and prefs files I can find, etc. Restarted the machine. Started FCP with the Option key held down.
    I've searched everywhere, but people usually seem to come across this error while opening FCP (I'm running 6.0.6). This only happens when I open a MOV file (oddly, I can open WMV files in FCP without issue).
    HELP!

    So, I open the movies in QT and the EXACT same thing is happening. So it appears the guy who created the videos did something when he was making them (probably made reference files along the way). I'm having him re-export the MOV files.
    So, it's not an FCP issue.....

  • Write to measurement file Express VI - TDMS file has separate "channels" for each data point

    I'm trying to write a VI to measure and record thermocouple data from an Advantech T/C DAQ. Using the "DAQNavi" Express VI provided by them, connected to the Write to Measurement File Express VI, I have managed to read in the data and create a TDMS file. However, when I open the TDMS file, each time step of temperature data is entered as a separate channel, instead of all of the channel data going into one tab. Obviously this is a huge problem, as it creates hundreds of tabs after just a few seconds. Any thoughts as to what causes this?

    Hi glibby,
    How did you configure the Write to Measurement Express VI? Please select "one header only".
    If you have your own timestamps to write, please merge your timestamp channel and measurement channels with "Merge Signals" before passing them to the Write to Measurement File VI.
    Best Regards,
    Mavis

  • New tutorials posted for Text Data Processing on Data Services 4.0

    Check out the 3 new Text Data Processing tutorials available at http://wiki.sdn.sap.com/wiki/display/BOBJ/TextDataProcessing within the Product Tutorials section.
    -- Introduction to Entity Extraction Transform: this 15-minute demo provides an overview of the Text Data Processing Entity Extraction transform for Data Services 4.0. The Entity Extraction transform enables you to process unstructured text, such as web pages, news articles, maintenance logs, text from a spreadsheet, or even from a database column, to extract key pieces of information that you can use in query and reporting.
    -- Using Text Data Processing Blueprints: this 6-minute demo explains how to get up and running with TDP quickly using a series of jobs contained in a blueprint. A blueprint is a sample end-to-end solution available on SAP Community Network.
    -- Creating an Extraction Dictionary: this 13-minute demo explains how to create an Extraction Dictionary, which can be used to customize a TDP Entity Extraction transform to improve results.

    Thanks for sharing your use case.  I have a few questions if you don't mind:
    1) What file formats are typically found within these compressed containers?  For example, html, xml, txt, pdf, MS Office, etc.
    2) Do these compressed containers ever have other embedded compressed containers within them?  For example, a zip file containing other zip files within it.
    3) If the intention is to process document files, do any of the document files have other document formats embedded/nested within them?  For example, a MS Word document with a spreadsheet embedded within it.

  • Oracle Content Management for spatial data

    Hi,
    Is it possible to use Oracle Universal Content Management software for managing GIS files/spatial data?
    Can we integrate Oracle Universal Content Management or any other tool to manage spatial files? I am looking for standard content management features like login/check-in/check-out/security/previews for spatial files, etc.
    Thanks

    Hi,
    - Does this product require an additional licensing cost, or does it come bundled?
    Please contact your Oracle Sales representative for license queries and questions.
    Global Pricing and Licensing
    http://www.oracle.com/us/corporate/pricing/index.htm
    - Does this product require an additional server, or can it run on the application server?
    - What are the key features we can get from Oracle Content Management Server?
    - What kinds of documents can we store in Oracle Content Management Server?
    - How many additional resources are required if we deploy this product?
    Please refer to the following links.
    Note: 305373.1 - Oracle Document Management in Release 11i through 12 - Installation and Configuration
    https://metalink2.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=305373.1
    Oracle Universal Content Management
    http://www.oracle.com/products/middleware/content-management/universal-content-management.html
    Oracle Content Management 10gR3 (10.1.3.3)
    http://download.oracle.com/docs/cd/E10316_01/ouc.htm
    Regards,
    Hussein

  • MDSYS Schema in the Database for spatial data

    Hi
    The MDSYS schema is required in the Oracle database to hold spatial data, but the client's database server has no MDSYS schema installed.
    So, when I import the mvdemo.dmp file into the database, it errors out since there is no Spatial schema. Please let me know how to get the MDSYS schema into the database.
    Thank You

    Hi, thanks for your response. I found this on the Oracle site, and the DBA ran this script to create MDSYS, but there was no data in some of the tables that are needed for development.
    --- The COMPATIBLE init.ora parameter must be set to 9.0.0.0.0 or higher:
    show parameter compat
    NAME         TYPE     VALUE
    compatible   string   10.2.0.3.0
    If you create an Oracle database using the Database Configuration Assistant (DBCA), Spatial is installed by default and you do not need to perform the installation steps described in this section.
    Steps to create the MDSYS schema and objects
    1) Connect to the database instance, specifying AS SYSDBA.
    2) Create the MDSYS user with a command in the following format:
    SQL> CREATE USER MDSYS IDENTIFIED BY <password>;
    3) Grant the required privileges to the MDSYS user by running the following script:
    SQL> @$ORACLE_HOME/md/admin/mdprivs.sql
    4) Connect as MDSYS.
    5) Install Spatial by running the following script:
    SQL> @$ORACLE_HOME/md/admin/catmd.sql
    6) When the installation completes, lock the MDSYS account again:
    SQL> ALTER USER MDSYS ACCOUNT LOCK;
    So this solution didn't work out...
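
    Before re-trying the mvdemo.dmp import, it may be worth confirming whether the Spatial installation actually completed; the component registry and the MDSYS object count are the quickest checks. A small sketch (the views are standard data-dictionary views; the expected numbers are approximate):

    -- Spatial should appear as 'SDO' with STATUS = 'VALID'
    -- (interMedia/ORDIM is a prerequisite for Spatial in 10g)
    SELECT comp_id, comp_name, version, status
      FROM dba_registry
     WHERE comp_id IN ('SDO', 'ORDIM');

    -- After catmd.sql finishes, MDSYS should own hundreds of packages,
    -- types, views, and tables; a near-empty count usually means the
    -- install script did not run to completion.
    SELECT object_type, COUNT(*) AS cnt
      FROM dba_objects
     WHERE owner = 'MDSYS'
     GROUP BY object_type
     ORDER BY object_type;

    Note that many MDSYS tables are legitimately empty until you create spatial metadata and indexes, so "no data in some of the tables" is not by itself proof of a broken install; a missing or INVALID registry entry, or a tiny object count, is.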

  • Anyone have an opinion on the usage of SECUREFILE LOBs for spatial data?

    Hi folks,
    From everything I read, the usage of SECUREFILE LOBs over BASICFILE LOBs appears to be a no-brainer.
    http://www.oracle.com/technetwork/database/options/compression/overview/securefiles-131281.pdf
    Yet the default table creation remains set to create BASICFILE LOBs so my 11g spatial data is still sitting in BASICFILE LOBs (also I need to push it to and from 10g for the moment).
    Should we all be changing over to SECUREFILE LOB storage as a matter of course on 11g? Is there a big payoff for VARRAY storage beyond compression? A little payoff?
    Oracle says SECUREFILE LOBs "dramatically improve performance". Hey, I want some of that!
    http://www.oracle.com/newsletters/sap/products/database/oradb11g-features.html
    But yet there just has not been much traffic on the topic.
    Pro Oracle Spatial mentions weakly on page 250 that "Secure File LOBs are expected to be faster than BASIC LOBs," but seems to only be referring to the spatial index.
    Here Godfrind says to use them with point clouds but only trumpets the compression aspect.
    http://www.ncg.knaw.nl/Studiedagen/09PointClouds/presentations/PointCloud_14_AlbertGodfrind.pdf
    For those of us using ArcSDE, it's pretty easy to add the SECUREFILE LOB keyword to the dbtune keyword that governs creation of the spatial column. But not a peep on the topic from the ESRI folks that I can find.
    Anyone have any experiences or opinions to share? I am hoping to finally leave 10g behind one day this year and am putting together a migration plan. Should converting the LOBs be a first-day bullet item or just something to consider sometime down the road?
    Thanks!
    Paul

    I have been writing on the benefits of rounding ordinates on storage and performance.
    That series has not yet been finished and fully published but my main finding (backed by stats/graphs) is that rounding ordinates does have an effect on performance as average coordinates per feature increases.
    As part of the work I also looked at SECUREFILE LOB storage in the hope that some sort of zlib/rle etc compression of the sdo_ordinates would substantially reduce storage size and thus increase performance.
    The reason I looked at this is because a customer has migrated from SDEBINARY to SDO_GEOMETRY (though most others are going from SDEBINARY to SDE.ST_GEOMETRY) and saw a substantial increase in storage costs. They have also taken a performance hit in some aspects of their operations, but that may be due to ArcSDE programmer SDO_GEOMETRY access issues rather than any real issues with Oracle performance (though the perception is very much that Oracle is to blame). SDEBINARY was always a compressed storage format, and SDE.ST_GEOMETRY, being WKB (of some description), also has a low storage footprint.
    The result of my work on LOB storage, though not published, is that I agree with Paul: there is little benefit for a lot of pain.
    However, I do still see the whole question of storage size as an area of potential benefit if reductions could be made, as this is a necessary precursor for Spatial being able to reap benefits if column-oriented storage ever migrates down to the standard Oracle database.
    regards
    Simon
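
    For anyone wanting to test what Paul and Simon are discussing outside ArcSDE, the storage decision is made per table in the DDL. A minimal sketch of what it could look like (table and column names are hypothetical; COMPRESS needs the Advanced Compression option, and SECUREFILE segments need an ASSM tablespace):

    -- Hypothetical table; the VARRAY ... STORE AS clause is what moves
    -- SDO_ORDINATES from the default BASICFILE LOB to a SECUREFILE LOB.
    -- Small geometries may still sit inline; the LOB only matters for
    -- large ordinate arrays.
    CREATE TABLE roads_sf (
      road_id NUMBER PRIMARY KEY,
      name    VARCHAR2(64),
      geom    SDO_GEOMETRY
    )
    VARRAY geom.sdo_ordinates STORE AS SECUREFILE LOB (
      COMPRESS MEDIUM
      CACHE
    );

    For an existing table, ALTER TABLE ... MOVE with the same VARRAY ... STORE AS SECUREFILE LOB clause should perform the conversion, but given Simon's findings it seems worth benchmarking on a copy of the data before making it a first-day item in a migration plan.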

  • Query for spatial data with a GeometryCollection fails

    There are exactly 538 CurvePolygons (only exterior rings in this sample). All of them are valid geometries, equal in dimension, and so on. Now I combine them into a GeometryCollection and query for other related spatial data in some tables. It seems that using around (not exactly!) 200 CurvePolygons in one GeometryCollection works fine, but adding more CurvePolygons results in an error from the spatial index (I can add the ORA- error numbers once I have some data in my test tables again in the next days).
    Is anybody else having trouble with this mysterious problem? Maybe there is a limit on the number of points in a GeometryCollection?
    (More details and programming code can be provided.)
    (Working with Java 1.3.1, oracle.sdoapi.*, Oracle 8.1.7.)

    Hi Lutz,
    Could you provide more info or samples of what is going wrong?
    Also, could you try making sure the geometry you are passing in as the query window is valid (i.e. instead of passing it in as a query window, pass it into sdo_geom.validate_geometry)?
    Thanks,
    Dan
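
    As a concrete illustration of the suggestion above, the query-window geometry can be checked on the server side before it is used. A sketch in SQL (the table name and tolerance are made up; depending on the release, the diminfo-based form of the call may be needed instead, and newer versions offer VALIDATE_GEOMETRY_WITH_CONTEXT for a more detailed reason):

    -- Returns 'TRUE' for a valid geometry, otherwise an Oracle error number
    SELECT c.id,
           SDO_GEOM.VALIDATE_GEOMETRY(c.geom, 0.005) AS validation_result
      FROM curvepolygon_windows c;

    If each CurvePolygon validates on its own but the assembled GeometryCollection fails, that points at the collection itself (its element or ordinate count) rather than the member geometries, which would match the roughly-200-element threshold described above.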

  • Call Bundling for custom bapi for mass data processing

    Hi all,
    http://help.sap.com/saphelp_erp2005vp/helpdata/en/4c/4c0e96725311d396a80004ac96334b/frameset.htm
    Can I create a custom BAPI in which the created update tasks are bundled, so that there are not single inserts but a single SQL INSERT with many records?
    Are there SAP function modules for doing this? The documentation says I must do "Operations in buffer" and "Update buffer data".
    Regards
    Paul

    Is ABAPFIELD an IMPORTING parameter?
    > Total Questions:  17 (15 unresolved) 
    Maybe you should consider cleaning up your old posts.
    Rob

  • Frequent "Searching for Movie Data in File ..." errors

    OK, I searched the forums for this but didn't find anything, so don't nobody jump down my throat please ...
    I very often find myself, when trying to export a file through QuickTime conversion, confronted with the error message "Searching for Movie Data in File [XXX - 0000000008]" which FCP invariably cannot find. I go to look in my capture drive myself, and sure enough, no file exists of that name. Without the file in question, FCP won't run the conversion. I get around it by exporting directly to QT, opening that file in QT Pro and re-exporting, but what the he11 is FCP doing in the first place? Why is it making all these files then losing/destroying them?
    Help very much appreciated. Running FCP 4.5 HD; rest of specs in profile.
    Thanks!

    Without being able to see your project's history, I can only surmise you have created some copies of some clips and then tossed the originals. Or used reference movies that point to renamed clips. Or you've renamed some clips but used a differently-named copy of it. You may have copied a clip from one project and placed into another and then later discarded the source media.
    > I get around it by exporting directly to QT, opening that file in QT Pro and re-exporting, but what the he11 is FCP doing in the first place? Why is it making all these files then losing/destroying them?
    That's because the original source media is always required for Compressor's operations. You see copies and renders in the timeline, but Compressor will always recreate fresh media, so it looks for the sources.
    This problem is almost always a user error, though it is not easily debugged from where I'm sitting. It is caused by a few subtle misperceptions about nesting and copying clips from one project to another without taking the source media with them.
    Sorry not to be more definitive, I'd need to have watched you work for several hours to know what's going on with this.
    bogiesan

  • Flat File Active Sync - Notify  admin incase of data processing errors

    Dear Friends,
    We have a couple of requirements for using the OOTB flat file Active Sync adapter:
    1. Read data from a flat file and update the records in the Sun Identity Manager system.
    2. Notify the admin if there are any data processing errors while reading data from the flat file. The data processing errors can occur if there is invalid data; for example, let's say the input flat file has 3 columns defined, but the file contains records which have four values:
    firstname,lastname,email
    testfirst,testlast,[email protected],12345
    Req#1 is working fine. There are no issues with that.
    Req #2: if the file contains invalid data, I noticed that the Active Sync adapter throws an ArrayIndexOutOfBounds exception, so we need to send an email notification to the admin whenever a data processing error occurs.
    I noticed that whenever the data processing exception occurs, the Active Sync adapter stops processing records and the Active Sync input form is not triggered. Unless the Active Sync form is triggered, it's very difficult to determine whether the data was read successfully or not.
    Please let me know if there are any configurations/customizations to be made on the OOTB flat file Active Sync adapter to handle data processing errors and send email notifications to administrators.
    Appreciate your help
    Thanks
    Vijay

    Hi,
    We have the same requirement:
    "Notify the admin if there are any data processing errors from a flat file.
    The data processing errors can occur if there is invalid data or an account is locked, etc."
    In short, notify the admin if any error is logged in the Active Sync log file while Active Sync runs.
    Yes, I noticed the same: whenever the data processing exception occurs, the Active Sync adapter stops processing records and the Active Sync input form is not triggered. Unless the Active Sync form is triggered, it's very difficult to go ahead and meet the requirement.
    Please let me know if there are any configurations/customizations to be made on the flat file Active Sync adapter to send email notifications to administrators.
    Thanks,
    Sudheer

  • Map Reporting Application - Push Data process creating 40 MB size files

    Hi all,
    We have 6 map reporting applications in our Planning app: 2 of them push data to a BSO cube, and 4 of them push data to an ASO cube. For the ASO cube ones, the process creates files that are about 40 MB in size on the Essbase server under the C:\Windows\Temp folder. The files are encrypted, so I can't tell what the content is... The only reason I think the push data process creates these files is the date stamp of the files, which matches the push data runs.
    Have you seen this? Is there a way to stop it? Since we run this process every hour and the files are pretty big, we run out of disk space pretty quickly. Our workaround is to purge or delete the files, but it would be ideal if we could stop this.
    Any help?
    Thanks,
    Mehmet

    Hi Mehmet,
    Which version of planning are you using?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • How can I perform the conversion of pdf files in Cyrillic script to Word files in Cyrillic script. The pdf file is too small for me to read right now. Julyan Watts

    How can I perform the conversion of .pdf files in Cyrillic script to Word files in Cyrillic script? The .pdf file is too small for me to read without a magnifying glass, and the document is more than one thousand pages.

    This answer was not helpful. First of all, I could not find "tech specs" anywhere on the Acrobat 11 homepage. And secondly, I purchased this software for the specific purpose of converting .pdf files to Word. It was only after I had completed the purchase that I learnt that Acrobat does not permit the conversion of .pdf files in Cyrillic to Word files in Cyrillic.
    I feel that Acrobat should have provided this crucial information before I allowed my credit card to be debited. That is why I am now asking for my money back. But thanks for your attempt to solve my problem, even if it was not successful.
    Julyan Watts

  • I have paid for a year subscription, but all the PDF's I attempt to export to Word give me a "Conversion Error" message, citing that the "file's security settings do not allow export." I want a refund. Adobe, please advise. Thank you.

    I have paid for a year subscription, but all the PDF's I attempt to export to Word give me a "Conversion Error" message, citing that the "file's security settings do not allow export." I want a refund. Adobe, please advise. Thank you.

    Hi la recruiter,
    There are a few types of files that Acrobat can't convert, and PDF files with security applied is one of them. I'm happy to cancel your subscription if you decide that it's not going to work out for you. Please confirm that you'd like me to proceed with the cancellation, and I'll take care of it right away.
    Best,
    Sara

  • File adapter configuration parameter for "hexadecimal conversion"

    (File adapter for XI 2.0)
    Does anybody know the exact File adapter configuration parameters for reading a file with hexadecimal control characters as separators? My file has 3 types of separators, i.e. "2F", "05" & "0D2F".
    E.g.: xml.fieldSeparator = "2F"
    When I specify this, the file gets split into fields at all of the separators in the file. I guess I am not using the right format in my definition?
    Dorai

    Is there any special format for reading a file containing hexadecimal control characters?
    The file to be processed has hexadecimal separators:
    Field Delimiter        : '09' or '05'
    End of record delimiter: '0A' or '25'
    End of Table delimiter : '07' or '2F'
    Initially I would like to read the file into XI as it is, i.e. without any split, by using "xml.fieldFixedLengths= ".
    However, every time I process the file (with hexadecimal control characters), I have noticed that my file is getting split, even when I did not specify anything for the field separator (i.e. when I used fixed field lengths).
    Thanks,
    Dorai

Maybe you are looking for

  • What is the best way to consolidate data on two Macs?

    Hi All, About to embark on a small project to move all my data on to my MacBook before finally saying goodbye to my trusty iMac.  In order to do this I'll be upgrading the HDD on the MacBook to a large one and using Time Machine to restore all the or

  • PC to Mac external hard drive question

    Hi All, I have an external hard drive that I used as a backup for pics and music from my old PC.  I've read about there being issues transferring between a PC formatted external HD and a Mac, so wanted to ask the community if there is anything I shou

  • Adding color background to text created

    Is there a way to add a color background to text I created? I am able to add text and then add a color background separately, but when I add it, I can't see the text. Thank you! Susan

  • USB communication

    Hello, I wonder if it is possible to communicate between a microcontroller and a computer using USB. The controller has the USB function on board. I just wonder if it is possible with LabVIEW to show the data transmitted over the USB port. Can someone tel

  • Using Two different drivers  simultaneously

    I want to use two different type 4 drivers, i.e. one driver for MS Access and one for MS SQL Server 7, simultaneously in one Java program. Please tell me how I can do this.