Data Grid, how to avoid unintentional data manipulation

Hi,
Is there an option to avoid unintentional writing in the data grid?
And if not, please implement such a toggle button, for security reasons.
Andre

Hello K.
thank you for your interest in and care about my mental health.
I'm alive and kicking.
But if I follow your logic, the developers of TOAD must have a really big (mental) problem.
They are obviously so paranoid that they say:
“Our data grid is always protected against data modification, as long as the user does not explicitly put the ROWID into the column list to make it unmistakably clear that they want to manipulate some data.”
I asked them if this is really needed, because in SQL Navigator there is just a simple button to toggle between RW and R only (which I definitely prefer).
And they said YES – to avoid unexpected / accidental changes (see the sketch below).
Do you find that still strange?
If yes, these guys must be more ill than the two of us can imagine.
But I’m very sure that they are alive and kicking too.
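
For illustration, here is roughly how the convention described above plays out in a query; the table and column names are made up, only the ROWID part matters:

-- a plain query: per the behaviour described above, the result grid stays read-only
SELECT employee_id, last_name, salary
FROM employees;

-- adding ROWID explicitly signals the intent to edit, and the grid becomes updatable
SELECT e.ROWID, e.employee_id, e.last_name, e.salary
FROM employees e;
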
By the way - one of my questions remained unanswered:
Did you ever analyse other tools to pick their best features in order to implement them into your tool?
If not, believe me, you should. It is worth the effort.
Otherwise it is quite possible that it will take you an inordinate amount of time until you reach an equivalent level.
Or do you want to be behind the others for eternity?
If you had the task of developing a new car, would it ever come to your mind to build it without a reverse gear and simply wait until someone requests something like that?
Some (useful) standards were set before you came along.
What about a little less self-satisfaction and a little bit more respect?
That’s my advice for you (also) beyond the sqldev tool.
Regards
Andre

Similar Messages

  • How to avoid clearing data in a block in a form

    Hi All
    I am using Oracle Forms 10g and database 10g.
    In my form I have a list item on my main canvas. It has the values (EXPENSE, AMOUNT, SUPPLIER, ACCOUNT).
    The user selects the list items one by one, enters values in the canvas assigned to each item (expense, amount, supplier, account), and then goes for a save.
    For instance, when the user enters values in expense, amount and supplier and then navigates back to amount, the entered value gets cleared.
    How do I avoid the data getting cleared?
    Thanks & regards
    Srikkanth

    You can use the EXECUTE_QUERY built-in to repopulate the data when that canvas or block gets focus. If it is needed at block level, write it in the WHEN-NEW-BLOCK-INSTANCE trigger; if it is a tab page, you can write it in the WHEN-TAB-PAGE-CHANGED trigger. If you need a WHERE condition, add a block-level WHERE condition. A small sketch follows below.
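
    A minimal sketch of that suggestion, assuming a hypothetical block named EXPENSE_BLK and a hypothetical control item :CTRL.EXPENSE_ID (neither name comes from the original post):

    -- WHEN-NEW-BLOCK-INSTANCE trigger on EXPENSE_BLK (Oracle Forms PL/SQL)
    BEGIN
      -- optional: restrict what is fetched with a block-level WHERE condition
      SET_BLOCK_PROPERTY('EXPENSE_BLK', DEFAULT_WHERE, 'expense_id = :CTRL.EXPENSE_ID');
      -- re-query the block so previously saved values reappear instead of showing up cleared
      EXECUTE_QUERY;
    END;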

  • How to avoid invalid data being entered in an LOV through code

    hi
    1) I have developed an LOV in a table region, but the user can easily enter invalid data and save it into the database tables.
    2) I created one formValue and mapped it to that return item, but it is still not working in the table region LOV.
    3) How do I avoid invalid data being entered in the LOV through code? I have tried the code below in the EOImpl set-value method, but somehow it is not working.
    if (value != null)
        throw new OAAttrValException(OAAttrValException.TYP_ENTITY_OBJECT,
                                     getEntityDef().getFullName(),
                                     getPrimaryKey(),
                                     "ProcurementCategory",
                                     getProcurementCategory(),
                                     "FND",
                                     "FND_LOV_SELECT_VALUE");
    Thanks.
    krish.

    Thanks Reetesh and Gourav for your help.
    I followed the mapping details below:
    LOV Item Properties
    ID -PurcCommodity
    ViewInstance -VendorVO1
    ViewObject -PurcCommodity
    map1 properties
    LOV Region Item - segment1
    Return Item -PurcCommodity
    CriteriaItem -PurcCommodity
    Use for Validation -Default
    map2 properties
    LOV Region Item - segment1
    Return Item -validation(formvalue)
    CriteriaItem -
    Use for Validation -yes
    form value properties
    ID -validation
    ViewInstance -VendorVO1
    ViewObject -PurcCommodity
    Gourav - I double-checked; with multiple rows it is not working, and sometimes it is not working for a single row either.
    Thanks
    krish.

  • How to avoid duplicate data while inserting from sample.dat file to table

    Hi Guys,
    We have an issue with duplicate data in a flat file while loading data from sample.dat into a table. How do we avoid duplicate data in the control file?
    Can anyone help me with this?
    Thanks in advance!
    Regards,
    LKR

    No, a control file will not remove duplicate data.
    You would be better off using an external table and then removing duplicate data using SQL as you query the data to insert it into your destination table (a sketch follows below).
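
    A minimal sketch of that approach, assuming a hypothetical directory object DATA_DIR, a two-column layout for sample.dat, and a destination table TARGET_TAB (none of these names come from the thread):

    -- External table over the flat file; column list and access parameters are assumptions
    CREATE TABLE ext_sample (
      id    NUMBER,
      name  VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('sample.dat')
    );

    -- Remove duplicates with SQL while inserting into the destination table
    INSERT INTO target_tab (id, name)
    SELECT DISTINCT id, name
      FROM ext_sample;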

  • Plugin container makes Flash Player crash. All is up to date. How to avoid this?

    Plugin container makes Flash Player crash. All is up to date. How to avoid this?

    First, download and run the Flash uninstaller: http://kb2.adobe.com/cps/141/tn_14157.html . You probably want the 64-bit version. After that has run, restart your computer, and then let's download a fresh version of Flash. Try downloading and installing it from [http://fpdownload.macromedia.com/pub/flashplayer/current/licensing/win/install_flash_player_11_plugin_32bit.exe here].
    Once you have flash installed again, start Firefox up, and see if you are getting any errors. If it works, awesome, if not, let's move on.
    Start Firefox up in [[Safe Mode|Safe Mode ]] (don't select any of the checkboxes that appear). If Flash works here, then it is one of your addons which is causing a problem.
    If we are still having a problem, try [[Updating your graphics driver|Updating your graphics driver]] .
    If none of these work, read [[Troubleshooting plugins|Troubleshooting plugins]] and let me know!

  • How to avoid seeing data for a particular member at the row level

    Hello,
    I am using Essbase. In it I have members like task n/a and demand n/a. The purpose of these members is: if I load data for demand, task won't have data, so I map it to task n/a, and the same for the task loading. I am using Alphablox for reporting and a report script for retrieving the data. So, if I select task n/a and demand n/a, some value will exist for them. Is there any possibility to remove this from the report? I mean, when viewing the report, it should not show this member and its respective value, and should also avoid taking this data into any manipulation. I know RESTRICT for column-level data. Is there anything for row-level restrictions?
    Please help me on this.
    Regards
    R.Prasanna

    Hi,
    In the rule containing the Color metadata, you can restrict the values of the list.
    In "Has restricted list and pane", list the values 1, 2 and 3 (a new line for each value):
    1
    2
    3
    Romain.

  • How to Avoid overlapping data label values in Pie Chart

    Hi,
    I am facing a problem: when there is a lot of data, my pie chart data label values overlap.
    I tried showing the data label values outside, but the customer is not accepting that, and I used the CollectedPie option as well, but it still overlaps. So please, anybody who knows how to resolve this problem, help, as I need it on a very urgent basis.
    Regards,
    HariKan

    Hi HariKan,
    Per my understanding, the category group of your pie chart returns many values, so the labels overlap, and you want to know whether there is any method to deal with this kind of problem, right?
    In Reporting Services, when enabling data labels in pie charts, the position of the data labels has only two options: inside and outside.
    In your scenario, I recommend increasing the size of the pie chart if you insist on keeping the labels inside the pie chart.
    If you choose "Enable 3D" in the chart area properties and choose to display the labels outside, the labels' layout will be clearer.
    Reference:
    Pie Charts (Report Builder and SSRS)
    Position Labels in a Chart (Report Builder and SSRS)
    If you have any question, please feel free to ask.
    Best regards,
    Vicky Liu
    TechNet Community Support

  • How to avoid losing data when communicating with a high-speed motor?

    I connect to a high-speed servo motor via RS232. To avoid losing data, I thought to set a receive buffer and only read the buffer once it has collected all the bytes. Is this possible?

    Hi,
    If you know the number of bytes you are trying to read, you can set a viRead call to return information once the particular number of bytes have been read.  For more information on this, take a look at the KnowledgeBase article on a Serial VISA Read to read a requested number of bytes. 
    Even if you read before all bytes have been collected, you should not lose data.  When the specified number of bytes are stored in the buffer, the viRead call will send the information to the program, and new data coming in will be stored in the buffer until the byte count is reached again.
    I hope this helps,
    Lauren L.
    Applications Engineering
    National Instruments

  • Date sync: how can I sync dates from 2 cameras

    hi
    How can I sync dates from 2 cameras? At a wedding, my camera had the correct date; the borrowed one was about 11 hrs out, but not exactly. I want to change all the pics from the 2nd camera to sync with the first. I've already imported them.
    Many thanks

    Use "Metadata→Adjust date and time".
    Instructions are in the User Manual here. Note that the change is an offset to the time-stamp listed in the EXIF.
    Message was edited by: Kirby Krieger

  • Using the GETGUI command in an ALV grid, how to extract multiple data

    Using the GETGUI command, I am able to get a single value from the ALV grid. Please explain how I can read multiple data, like row and column data, from the ALV grid.
    Please be detailed, as I have already tried selecting the whole block of the ALV grid, but that did not help.
    Regards
    Srinivas.

    Hi Srinivas,
    You will have to use the concept of regular expressions for this. We will have to loop through each row/column to do what you desire.
    The ID of an element on the grid will be something like GRIDNAME-<ELEMENT NAME>[ROW NO] [COLUMN NO]
    We need to parameterize the row, and the element name also changes for each column along with the column number. Please let me know if this much detail is enough. If not, I can show you a real-time eCATT code snippet on how to play around with grids.
    Regards,
    Justin

  • Data Pump - How to avoid exporting/importing dbms_scheduler jobs?

    Hi,
    I am using Data Pump to export a user's objects. When I import them, it also imports any jobs that user has created with DBMS_SCHEDULER. How can I avoid this? I tried EXCLUDE=JOBS but had no luck.
    Thanks,
    Jon.
    Here are my export and import parameter files:
    Export parameter file:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=REVEAL
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log

    Import parameter file:
    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=reveal
    REMAP_SCHEMA=reveal:reveal_backup
    TRANSFORM=SEGMENT_ATTRIBUTES:n
    EXCLUDE=TABLE_STATISTICS
    EXCLUDE=INDEX_STATISTICS
    LOGFILE=reveal.log

    Sorry for the reply to an old post.
    It seems that now (10.2.0.4) JOB is included in the list of SCHEMA_EXPORT_OBJECTS.
    SQL> SELECT OBJECT_PATH FROM SCHEMA_EXPORT_OBJECTS WHERE object_path LIKE '%JOB%';
    OBJECT_PATH
    JOB
    SCHEMA_EXPORT/JOB
    Unfortunately, EXCLUDE=JOB still generates an invalid argument error on my schema imports. I also don't know whether these are old-style jobs or scheduler jobs. I don't see anything for object_path LIKE '%SCHED%', which is my real interest anyway.
    Data Pump is so rich already, I hate to ask for more, but ... may we please have even more? scheduler_programs, scheduler_jobs, scheduler, etc.
    Thanks
    Steve
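
    For anyone landing here later, this is a rough sketch of how such an exclusion is often written in an import parameter file. EXCLUDE=PROCOBJ (the path for procedural objects, which include DBMS_SCHEDULER jobs and programs) is a commonly suggested alternative to EXCLUDE=JOB, but verify it against your own release; the parameter file below is illustrative rather than something taken from this thread.

    DIRECTORY=dpump_dir1
    DUMPFILE=reveal.dmp
    CONTENT=METADATA_ONLY
    SCHEMAS=reveal
    REMAP_SCHEMA=reveal:reveal_backup
    # PROCOBJ covers procedural objects, including DBMS_SCHEDULER jobs and programs
    EXCLUDE=PROCOBJ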

  • How to avoid doubling data

    Hi experts,
    I loaded flat file 1 into an ODS and then into an InfoCube. Then I loaded another flat file 2 (which had the same records as well as a few more records than the first one) the same way. When I look at the InfoCube, some of the records have double the values. How do I avoid this? Can I go for a delta load from the ODS to the InfoCube?
    All the above loads are full updates.
    Also, SDNers:
    By full update, do we get all the data again if we load from the ODS, even if we loaded it previously?
    thanks
    Dave
    Message was edited by: Dave Marcus

    If you go with delta from the ODS to the cube, the ODS will take care of sending only the changed records and the new records.
    see this link:
    Re: Delta Processing in ODS
    Also see:
    ODS & Delta
    Thanks,
    Raj

  • How to avoid automatic data change in service notification from work order

    Hi,
    When I change work order address data, the address data from the service notification connected to the order is changed too. How can I avoid this change? Is there anything in customizing to control which fields are updated automatically?
    Thanks,
    Juan.

    Hi,
    You can try using Books of Business.
    Create a book and add that particular user to that book and at the time of task creation, through a workflow you can assign it to the corresponding book.
    This might resolve the access issue you are facing.
    hope it helps,
    Mayank

  • How to avoid 'duplicate data record' error message when loading master data

    Dear Experts
    We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this datasource are the same as the settings of 0COSTCENTER_ATTR. The problem is that when loading to BW it seems that validity (DATEFROM and DATETO) is not taken into account. If there is a cost center with several entries having different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
    Enhancing 0COSTCENTER_ATTR to have one datasource instead of two is not an option.
    I know that you can set ignore duplicates in the infopackage, but that is not a nice solution. 0COSTCENTER_ATTR can run without this!
    Is there a trick you know to tell the system that the date fields are also part of the key?
    Thank you for your help
    Peter

    Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
    Siggi - I don't have the error message described in the note.
    "There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
    In PSA the records are marked red with the same message (MSG no 191).
    As you see the key does not contain the date when the record is valid. How do I add it? How is it working for 0COSTCENTER_ATTR with the same records? Is it done on the R/3 or on the BW side?
    Thanks
    Peter

  • How to avoid duplicate data loading from SAP-r/3 to BI

    Hi !
    I have created one process chain that will load data into some ODS from R/3, where (in R/3) the datasources/tables are updated daily.
    I want to schedule the system such that, if on any day the source data is not updated (if the tables are unchanged), then that data should not be loaded into the ODS.
    Can anyone suggest such a mechanism, so that I can always have unique data in my data targets?
    Please reply soon.
    Thank you!
    Pankaj K.

    Hello Pankaj,
    By setting the unique records option, you are pretty much letting the system know not to check the uniqueness of the records using the change log and the ODS active table log.
    Also, to avoid the problem where you have dual requests getting activated at the same time, please make sure you select the options "Set Quality Status to 'OK' Automatically" and "Activate Data Automatically"; that way you have the option to delete a request as required without having to delete the whole data set.
    This is all to avoid the issue where even the new request has to be deleted in order to delete the duplicate data.
    Unless a timestamp field is available in the table on top of which you have created the datasource, it will be difficult to check the delta load.
    Check the table used to make sure there is no timestamp field or any other numeric counter field which could be used for creating a delta queue for the datasource you are dealing with.
    Let me know if this information is helpful or if you need additional information.
    Thanks
    Dharma.
