OData Refresh from On-Premise Datasource
Hi
In Office365 PowerBI Admin centre I have built a connection to our on-premise data warehouse, i.e. installed a gateway, created the connection, enabled cloud access and OData feed and added some views.
My colleague has built a PowerView spreadsheet using the OData feed and published it to our PowerBI portal.
I have scheduled a daily refresh of the spreadsheet but every time it runs it returns the following error:
Failure Correlation ID: 35e739a4-e45d-4225-a442-ffadc88fe33c (this guid changes every time)
A connection could not be made to the data source with the DataSourceID of '74259462-....-....-....-249ca7edc4c5', Name of
'feedname'. An error occurred while processing table 'tablename'.
I have tried different credentials in the connection; currently it uses a SQL login, but I have also tried a domain login, to no avail.
Can anyone point me in the right direction for sorting this out?
TIA
Julian
Now I'm confused!
We are using an ODATA feed from the data source and as such, the connection information in the workbook is a URL: the provider is Microsoft Data Feed Provider and the connection string is:
Source=https://sitename.hybridproxy.powerbi.com/ODataService/v1.0/DataSource_Name;Service Document
In the datasource properties in PowerBI the OData feed URL is
https://sitename.hybridproxy.powerbi.com/ODataService/v1.0/DataSource_Name
The data source itself is pointing to our gateway and uses the Microsoft OLE DB Provider for SQL Server.
I have another data source that uses the same gateway, which I created by extracting the PowerBI connection string from a spreadsheet that connected directly to the database. This data source connection uses the .NET Framework Data Provider for SQL Server. Published spreadsheets that use this data source will refresh.
Could the provider be the issue?
Similar Messages
-
Copying multiple tables from On Premises SQL to Azure Blob
Hello Team,
Using ADF, I want to move data from an on-premises dataset with multiple tables to Azure Blob. Can you please provide some help or share some samples?
Thanks,
Eshetu

You would follow this sample:
http://azure.microsoft.com/en-us/documentation/articles/data-factory-use-onpremises-datasources/
Essentially you use a Data Management Gateway service that lives on premises to talk to SQL Server or to the NTFS file system, and copy up to WASB blob storage.
I don't know about putting multiple tables in one JSON document, but I believe it can process a folder of files if you give the folder path instead of the full file name. You may have to use multiple datasets to copy multiple tables.
Thanks, Jason
-
Error while consuming Odata service from Gateway client i.e /iwfnd/gw_client
Hello Experts.
I am facing the below error while consuming the OData service from the GW client. The error is "No service found for the namespace /IWFND/, name ZTEST_STOREROOM_SRV, version 001". I have even tried to dig into /IWFND/ERROR_LOG, but to no avail.
What I am actually doing: my aim is to connect multiple back-end systems on the same server with the help of the aliasing concept. I have created multiple aliases and added them in the /IWFND/MAINT_SERVICE transaction, but I cannot work out how to consume the service.
I have followed the solution to some extent from the link => Multiple Origin Composition - SAP NetWeaver Gateway Foundation (SAP_GWFND) - SAP Library
Can you please let me know how to resolve this? Also, please let me know the syntax for the URI.
Your help is highly appreciated..
Please find the screenshot attached.
Thanks,
Srinivas.

Hello @Nrisimhanadh_Yandamuri
Thanks for your reply..
I have got all the required authorizations, but I am still not able to hit the service. Please let me know what the solution could be.
Thanks,
Srinivas. -
Dear Bhudev/Guest,
I am planning to refresh my quality system from production's offline database backup. I have seen a lot of your posts.
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Refresh from Offline backup (AQC from APC)
Let's say AQC is quality and APC is production.
Pre-processing Activity:
Verifications:
1. Verify the last offline backup in the source system /oracle//sapbackup/back.log
2. Verify that both the source and target systems are at the same Oracle level (which is already the case between APC and AQC)
3. It is recommended that an offline backup be taken of the target system before starting this process, in case of fallback.
4. Confirm there is enough space in the target (AQC) file systems to hold the data from the source system; this is required as we may have to copy the files from production to quality. The /oracle/SID/sapdata* folders should have a little more space than the corresponding source filesystems.
5. Check access to SAP* for APC (000 and 400) and also for AQC (000 and 400)
6. Check other passwords such as adm and ora
7. Server : User : ora > sqlplus "/ as sysdba" > alter database backup controlfile to trace > exit
   In /oracle/SID/saptrace/usertrace, the last trace (check the time) is the script for recreating the control file. You will need to edit it. Rename the trace file to ld.sql and send the file to the target system's /oracle/SID.
   Edit the script: vi ld.sql
   - Delete or comment out all lines at the beginning of the file before STARTUP NOMOUNT, which is kept.
   - Change the old controlfile creation statement at the beginning of the controlfile script. Old line: create controlfile reuse database NORESETLOGS archivelog; New line: create controlfile set database RESETLOGS archivelog;
   - Replace all occurrences of the source string by the target string in each line where it appears: :g / / s / / /g  Example: :g/P10/s/P10/T21/g
   - Delete the following lines at the end of the script: RECOVER DATABASE; ALTER SYSTEM ARCHIVE LOG ALL; ALTER DATABASE OPEN;
8. Get from the Prod system (APC) the .aff and backDPR.log files and copy them to AQC (/oracle//sapbackup). Copy the files to /instkits/SID/; then, on the target server, copy them from /instkits/SID to /oracle/SID/sapbackup/, with the correct owner and permissions.
9. Lock all the users in client 400 except Basis and post a system message.

Target system data collection (on AQC-400):
1. SM59 Expand all the trees, print by choosing: ->System ->List ->Print i) Compare list to R/3 connections on source system. Record details for any that will need to be recreated on target system.
2. SCC4 Record all client settings. -> Tableview -> Print -> Standard List -> List -> Print. Make sure you understand the columns...
3. SE06 Record system change options
4. RZ10 Record the instance profile settings for: rdisp/wp-no-btc and rdisp/max_wprun_time
5. STMS Record current transport routes. ->Overview -> Transport Routes -> manually expand all trees -> system ->list ->print
6. Request the customer to provide a "catch-up" list of transports to be applied to the system after the refresh (QA and Sandbox systems).
7. RZ04 Record current work process set-up. Highlight each operation mode and click instances/op modes button. i) Print with ->System ->List ->Print
8. SAP License: Ensure the SAP license information is available. If not on hand, log on to the target system as adm and enter the command: saplicense -show. This information is also available on OSS.
9. TSM Nodename View /usr/tivoli/tsm/client/api/bin64/dsm.sys on source and target system. Record the nodename entries. These will be SAP or server names.
11. Take printer Export through SPAD and save it locally to desktop->SPAD
12. Take an export of the user master from client 400 through SCC8 (target system AQC), so that it may be re-imported through STMS. Note the transport number, and also verify the transport exists in the queue.

DB restore from offline backup from APC:
1. Stop SAP on AQC -> Login as aqcadm -> stopsap
2. Change the TSM node on AQC:
   cd /usr/tivoli/tsm/client/api/bin64
   ls -lrt
   vi /usr/tivoli/tsm/client/api/bin64/dsm.sys
   Change it to point to APC; comment out the AQC node.

Post-restore activity:
1. Stop the database, then delete the old controlfiles:
   rm /oracle/AQC/sapdata1/system_1/cntrl/*
   rm /oracle/AQC/saparch/cntrl/*
   rm /oracle/AQC/origlogA/cntrl/*
2. Re-create the controlfile and then start the database. User: ora
   > cd
   > sqlplus "/ as sysdba"
   > @ld.sql
   > alter database open resetlogs;
3. Change the sapr3 password: Server : User : ora. Temporarily, we have to change the sapr3 password to the default password. With SAPDBA, change this user's password to pass.
4. Test the connection: Server : User : adm > R3trans -d  It should have a return code of 0000.
5. Reassign the tempfile:
   alter tablespace psaptemp add tempfile '/oracle/AQC/sapdata1/temp_1/temp.data1' size 6020M reuse autoextend on next 20971520 maxsize 10000M;
   alter database rename global_name to AQC.WORLD;
   drop user OPS$APCADM cascade;
   drop user OPS$ORAAPC;
   create user OPS$ORAAQC identified externally default tablespace psapdnausr temporary tablespace psaptemp;
   grant connect, unlimited tablespace, SAPDBA to OPS$ORAAQC;
   grant connect, unlimited tablespace, SAPDBA to OPS$AQCADM;
7. Change the instance profile to stop all background processing. To be sure that no production batch jobs are executing, set rdisp/wp_no_btc to 0.

Post-processing:
8. Startsap
9. Run Transaction SICK (as sap*)
10. Install the license through transaction SLICENSE while logged in to client 000 as SAP* using the password of the APC system. Get the license from service.sap.com and download it to the local machine, then log in to client 000 as SAP* and run transaction SLICENSE to import it. After that: import printers, run BDLS for logical system name conversions, SE06 post-processing, STMS, user master import, and import profiles for QAS.
thanks
Bhudev
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Hi Pranav
why are the SIDs of production and QAS the same? Did you do a refresh of QAS from PRD?
If yes, then you should have done the control file creation and the change of SIDs after the database restore...
however, if you want to create the new control file and change the SID then try to consider the following steps:
1. Login as orasid
2. connect to sqlplus
3. run query: alter database PRD backup controlfile to trace;
4. It will create a control file as a latest trace file (.trc) in the directory /oracle/PRD/saptrace/usertrace
note: logout of sqlplus
5. Rename that .trc file into ldQAS.sql
6. copy this file into /oracle/SID ie. /oracle/QAS directory
7. edit the .sql file as below
8. CREATE CONTROLFILE SET DATABASE "QAS" RESETLOGS ARCHIVELOG;
add the above statement, replacing the below:
CREATE CONTROLFILE REUSE DATABASE "QAS" NORESETLOGS ARCHIVELOG;
9. Replace all occurrences of the string <PRD_SAPsid> by <QAS_SAPsid> in each line where <source_SAPsid> appears:
:g/PRD/s/PRD/QAS/g
10. login in sqlplus, stop database
11. as oraqas, delete old control files:
rm /oracle/SID/sapdata1/system_1/cntrl/*
rm /oracle/SID/saparch/cntrl/*
rm /oracle/SID/origlogA/cntrl/*
12. again login into sqlplus (as sysdba always)
13. run this .sql : @ldQAS.sql
note: it should give a message that the control file was created.
14. Open the database using: alter database open resetlogs;
15. alter database rename global_name to QAS.WORLD;
16. drop user OPS$PRDADM cascade;
17. drop user OPS$ORAPRD;
18. create user OPS$ORAQAS identified externally default tablespace <tablespacename>
temporary tablespace psaptemp;
19. grant connect, unlimited tablespace, SAPDBA to OPS$ORAQAS;
grant connect, unlimited tablespace, SAPDBA to OPS$QASADM;
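The manual vi edits to the controlfile trace script (steps 7-9 above) can also be scripted; here is a minimal Python sketch of the same rewrite. The function name and the transformation rules are illustrative assumptions distilled from the steps above, not an official tool:

```python
def rewrite_controlfile_script(text, source_sid="PRD", target_sid="QAS"):
    """Rewrite a CREATE CONTROLFILE trace script for the target SID.

    Mirrors the manual edits described above: REUSE -> SET,
    NORESETLOGS -> RESETLOGS, plus the global substitution
    equivalent to :g/PRD/s/PRD/QAS/g (which also fixes the
    /oracle/PRD/... datafile paths).
    """
    text = text.replace("CREATE CONTROLFILE REUSE DATABASE",
                        "CREATE CONTROLFILE SET DATABASE")
    text = text.replace("NORESETLOGS", "RESETLOGS")
    return text.replace(source_sid, target_sid)

# Demo on one line of a typical trace script:
line = 'CREATE CONTROLFILE REUSE DATABASE "PRD" NORESETLOGS ARCHIVELOG;'
print(rewrite_controlfile_script(line))
# -> CREATE CONTROLFILE SET DATABASE "QAS" RESETLOGS ARCHIVELOG;
```

You would still delete the leading comment lines and the trailing RECOVER/OPEN statements by hand (or with two more replaces) before running the script via @ldQAS.sql.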
at SAP level:
1. you can run BDLS to convert logicalsystems from PRD to QAS SID
Bhudev
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
In both posts there are some differences, especially regarding user dropping.
Do I have to run BDLS, or drop and recreate the users, or do both activities have to be done?
Regards
Dharmendra

Please explain step by step:
7. Server : User : ora > sqlplus "/ as sysdba" > alter database backup controlfile to trace > exit
In /oracle/SID/saptrace/usertrace, the last trace (check the time) is the script for recreating the control file.
You will need to edit it. Please rename the trace file to ld.sql and send the file to the target system /oracle/SID
Edit the Script: vi ld.sql Delete or comment out all lines at the beginning of the file before STARTUP NOMOUNT which is kept.
Change the old controlfile creation statement at the beginning of the controlfile script. Old line: create controlfile reuse database NORESETLOGS archivelog; New line: create controlfile set database RESETLOGS archivelog; Replace all occurrences of the source string by the target string in each line where it appears: :g / / s / / /g  Example: :g/P10/s/P10/T21/g  Delete the following lines at the end of the script: RECOVER DATABASE; ALTER SYSTEM ARCHIVE LOG ALL; ALTER DATABASE OPEN;
8. Get from the Prod system (APC) the .aff and backDPR.log files and copy them to AQC (/oracle//sapbackup). Copy the files to /instkits/SID/; then, on the target server, copy them from /instkits/SID to /oracle/SID/sapbackup/, with the correct owner and permissions.
Break into steps and explain. It looks confusing to me because I am new to ECC with Oracle.
Regards
Dharmendra -
URGENT help needed:Address data missing after QA Refresh from PRD
All,
Address data for almost all user-ids are missing after QA Refresh from PRD.
In QA, after importing the user master, although it shows successful, the detailed log shows:
Data inconsistency in USR21. Start RSADRCK2 (See Note 459763)
Exit program AFTER_IMP_ADDRESS3 successfully executed
SAP user has no address SAP*
Error while deleting ADRVP for SAP*
SAP user has no address SAPCPIC...
ERROR: Type "F" user exit with SYS_ERROR: SUSR_CLIENTCOPY_USERBUF_RESET
We also do a Table export - import wherein the tables
USR03
USR07
USR09
USR20
USR21
USR30
are included.
The number of entries exported and imported is the same.
Also FYI in the User-master Transport i can see the following Tables included in the object list
USR01
USR02
USR04
USR05
USR06
USR08
USR14
USR21S
USR22
USRACL
USREXTID
USREXTIDH
Has anyone seen this before?
Anybody have any ideas?

Hello Bidwan,
I think it is an issue with company addresses. Just check whether the company addresses exist in the source client. After a client copy, the company addresses will only exist in the source client. Then, if you import the transport containing the USR* tables, it will try to assign the old company addresses to the users, but they probably no longer exist in the target client.
If this is the case then you need to create those company addresses again using SUCOMP and then once again import the transport for user master.
Regards.
Ruchit. -
Database Refresh From ASM Filesystem to Local Filesystem
Hi ALL,
I am performing a database refresh from production server to a demo server. Our Production database is 11.2.0.1 and it is using ASM filesystem to keep the data, redo and other files in ASM disks.
On the other hand demo server is not having ASM, all the database files are stored in a local filesytem.
I have taken a fresh backup of our production database, but I am not sure how to perform the restore part, as the demo server does not have ASM.
Can anyone suggest how to perform this, i.e. a datafile restore from ASM to a local filesystem?
Any useful links would be helpful.
Regards,
Arijit

Hello,
You can restore the backup of your Production database which is using ASM to your demo server (using file system).
Make sure that the control_files parameter in the pfile/spfile points to the file system location where you want the control files located on the demo server.
Next, before you use the restore command to restore the database, provide the location of the datafile where you need to restore using the "set newname" clause.
run {
  set newname for datafile 1 to '<file-system-location-on-demo-server>';
  set newname for datafile 2 to '<file-system-location-on-demo-server>';
  restore database;
  switch datafile all;
  recover database;
}
-
Database refresh from cold backup and hotbackup.
How can we perform Database refresh from cold backup and hotbackup?
OracleM wrote:
How can we perform Database refresh from cold backup and hotbackup?

I understand that you have a cold/hot backup and you need to recover (refresh) from this backup? If yes, then restore the cold/hot backup and:
sqlplus "/as sysdba"
startup mount;
recover database using backup controlfile until cancel;
/*then apply all available archive logs*/
alter database open resetlogs; -
Database refresh from production to test -how to clean existing test env
All,
My environment is
Both Production and Test databases are in two node RAC environment
Oracle version - 10.2.0.4.0
OS - RHEL5
Production database size 80GB
We need to refresh the test environment from production database. Complete objects, data etc should be refreshed.
We have a datapump export from production environment. With this export dump from production environment, I need to import into test environment.
So far, I have imported with this kind of dump to the fresh database only.
Now we already have objects and data sitting in the test environment. How do we clean the existing test environment and refresh it from the production datapump export dump?
I thought of dropping all the tablespaces in test (other than system, sysaux, undo and temp), but I am not able to drop a few tablespaces; it complains that an index exists in another tablespace, dependency errors, etc.
Which is the best method to clean the existing test database? Management is not interested in dropping the test database and recreating it.

I understand that you are a newbie, so let me give you simple steps.
Follow the steps on testing envi.
1. Drop only the application users that you want to refresh from Prod (do NOT drop system users or tablespaces such as SYSTEM, SYSAUX, etc.)
2. Create the users that you dropped.
3. using import or import data pump import the data.
In case you want to import user "A"'s data into "B", use the REMAP_SCHEMA option.
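As a sketch of what that remap looks like on the command line, here is a small Python helper that assembles an impdp invocation. The directory object, dump file name and credentials are hypothetical placeholders, not values from this thread:

```python
def impdp_remap_cmd(source_schema, target_schema,
                    directory="DP_DIR", dumpfile="prod_full.dmp"):
    """Build an impdp command importing source_schema's data as
    target_schema, using Data Pump's REMAP_SCHEMA parameter.
    directory/dumpfile are example names; substitute your own."""
    return [
        "impdp", "system/********",
        f"directory={directory}",
        f"dumpfile={dumpfile}",
        f"schemas={source_schema}",
        f"remap_schema={source_schema}:{target_schema}",
    ]

print(" ".join(impdp_remap_cmd("A", "B")))
# -> impdp system/******** directory=DP_DIR dumpfile=prod_full.dmp schemas=A remap_schema=A:B
```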
See the below link for data pump export/import with examples.
http://oracleracexpert.blogspot.com/2009/08/oracle-data-pump-exportimport.html
Hope this helps.
Regards,
Satishbabu Gunukula
http://oracleracexpert.blogspot.com
Click here for [How to add and remove OCR|http://oracleracexpert.blogspot.com/2009/09/how-to-add-and-remove-ocr-oracle.html]
Click here in [ Making decisions and migrating your databases |http://oracleracexpert.blogspot.com/2009/08/download-sql-developer-migration.html]
Click here to learn [ Static parameters change in SPFILE and PFILE|http://oracleracexpert.blogspot.com/2009/09/how-to-change-static-parameters-in.html]
Edited by: Satishbabu Gunukula on Sep 14, 2009 5:09 PM
Edited by: Satishbabu Gunukula on Sep 18, 2009 10:35 AM -
System refresh from Production to Quality
Hi,
We are going for system refresh from Production to Quality. We are at SAP NetWeaver 2004s with 700 release and at 0021 level. Our data base system is DB6 with the release 09.07.0000
I understand that there is a note 886102 available for the system copy. But I would like to know how that had been practically implemented from your ready documents like
1. What would be the BI consultant role during the refresh (I mean where do we involve at). I have seen many links related to this but nothing answer my question, so please don't give me links available.
2. How to identify tables that need to be copied and restored to retain the correct source systems for data/info sources.
3. What should be the BI consultant task before refresh?
4. What should be the BI consultant task post refresh?
5. What are issues faced post refresh in quality system.
I request, the consultant who had worked on these refresh can provide me correct solution.
Thanks in advance.
Regards.
Raj

Our pre-refresh activities included:
Inform the security team to make no user or authorisation changes in quality during the refresh.
Set a message in development not to release any transports any more, and set a message in quality to inform users not to manually import transports into quality and not to approve transports for production. This ensures no transports get moved to quality and production.
Switch off cyclic import all job (like TMS_0000000038TMS_TP_IMPORT) and the cyclic RSTMS_DIST_APPROVED_REQUESTS job
Prepare list of transports for re-import to quality after refresh and give this to BASIS.
Post refresh activities included
Tcodes SM37, SM35 and SP01. Check that BASIS had set all released jobs to status "Susp/Released"
"All jobs are in 'Susp/Released' state. Set them all to 'Scheduled ' as follows:
- Run report BTCTRNS2 to change all to 'Released'.
- Immediately use SM37 to change all to 'Scheduled' "
IF ANY ARE NEEDED. Remember to change Exec Target in any job you need to release.
"Schedule RSTMS_DIST_APPROVED_REQUESTS to run at x:29 and x:59 - so every 30 minutes.
Please schedule with DDIC as step user (and not your own user-id)."
Check the STMS_QA and import queues to be sure that the transports are correct (no extra ones appeared during the refresh).
Once happy with the above request that Basis schedule the auto import to run every 30 minutes
First ensure that BDLS has finished and system is ready for use.
Post refresh issues faced in production
Many reinit issues
ACR issues.
Master data issues. -
HI All,
I am trying to create a block based on a FROM clause query as its datasource, but I need to reference several control block items as criteria in the select statement.
When I execute the query, the system keeps saying 'Invalid column name'.
Am I not allowed to reference block items in the FROM clause query select statement?
Thanks!!!

Thanks!
I am OK now. I wanted to change the where clause dynamically, so I dynamically change the 'Datasource Name' property to achieve this.
Thanks again! -
Unable to refresh from profile
When I attempt to connect a Solaris 10 Update 4 client to my Sun iPlanet Directory Server 5.2 LDAP server, using the following configuration:
NS_LDAP_FILE_VERSION= 2.0
NS_LDAP_SERVERS= 172.16.20.200
NS_LDAP_SEARCH_BASEDN= ou=SciDevGrid,dc=c-noofs.gc.ca
NS_LDAP_AUTH= simple;sasl/DIGEST-MD5
NS_LDAP_SEARCH_REF= FALSE
NS_LDAP_SEARCH_SCOPE= sub
NS_LDAP_SEARCH_TIME= 15
NS_LDAP_CACHETTL= 86400
NS_LDAP_PROFILE= SciDevGrid
NS_LDAP_CREDENTIAL_LEVEL= proxy
NS_LDAP_SERVICE_SEARCH_DESC= passwd:ou=People,ou=Default,dc=c-noofs.gc.ca
NS_LDAP_SERVICE_SEARCH_DESC= group:ou=group,ou=Default,dc=c-noofs.gc.ca
NS_LDAP_BIND_TIME= 10
It works fine, and all information the client needs from the server is fully accessible. I can log in as an LDAP user, and the hosts table and automounter are updated properly. However, I receive this error message in the messages log when ldap_cachemgr tries to update the profile:
Aug 26 11:32:57 CNOOFS01 ldap_cachemgr[21216]: [ID 722288 daemon.error] Error: Unable to refresh from profile:SciDevGrid. (error=2)
The cachemgr log file (/var/ldap/cachemgr.log) shows:
Tue Aug 26 11:32:57.7150 Starting ldap_cachemgr, logfile /var/ldap/cachemgr.log
Tue Aug 26 11:32:57.7262 sig_ok_to_exit(): parent exiting...
Tue Aug 26 11:32:57.7344 Error: Unable to refresh from profile SciDevGrid (error=2)
Tue Aug 26 11:32:57.7344 Error: Unable to update from profile
The SciDevGrid profile in the LDAP server (cn=SciDevGrid,ou=profile,dc=c-noofs.gc.ca) has the following configuration:
(the passwd and group ou's are inherited from the default profile via serviceSearchDescriptor entries)
cn = SciDevGrid
authenticationMethod = simple;sasl/DIGEST-MD5
bindTimeLimit = 10
credentialLevel = proxy
defaultSearchBase = ou=SciDevGrid,dc=c-noofs.gc.ca
defaultSearchScope = sub
defaultServerList = 172.16.20.200
followReferrals = false
objectClass = DUAConfigProfile
objectClass = top
profileTTL = 86400
searchTimeLimit = 15
serviceSearchDescriptor = passwd:ou=People,ou=Default,dc=c-noofs.gc.ca
serviceSearchDescriptor = group:ou=group,ou=Default,dc=c-noofs.gc.ca
If I go into the LDAP server and modify the SciDevGrid profile to give defaultSearchBase the value "dc=c-noofs.gc.ca", the cachemgr error messages will stop. That leads me to believe this is a configuration problem. Has anyone run across this problem before?
Basically, what I am trying to achieve is a "default" profile which contains users and groups which apply to all LDAP-connected machines. Other profiles will inherit these attributes from the default profile, while providing additional configuration (automount maps and host table) specific to each compute grid we have.
Thanks in advance for any advice you can provide,
Adam Lundrigan
UNIX SysAdmin, C-NOOFS project
Biological and Physical Oceanography Section, Science Branch
Department of Fisheries and Oceans Canada
[email protected]
Edited by: C-NOOFS on 27-Aug-2008 9:48 AM

I am having the same issue. Can anybody help?
-
Texts from non-infoobject datasource
Hi all.
In SAP source system I got table with additional attribs for CUSTOMER:
CUST_STATUS
CUST_STATUS_TEXT
CUST_WAREHOUSE_STATUS
CUST_WAREHOUSE_TEXT
I've created data source for the table. Z_CUST_ENHANCE
I've created CUST_STATUS and CUST_WAREHOUSE_STATUS infoobjects with texts. I get duplicate key error messages while loading data directly from the Z_CUST_ENHANCE datasource (which is logical).
How am I supposed to load texts and keys into the CUST_STATUS and CUST_WAREHOUSE_STATUS info-objects?

Hi,
Try this method:
Make those objects master data infoobjects with texts.
Now create 4 DataSources for loading those IDs and texts into those objects.
While loading, make use of direct update.
Revert back for further queries.
Regards,
Rajknadula -
"Refresh from [myConnection]" on offline table
hi
Please consider this scenario.
(1) in SQL*Plus, "create table my_table(col_a varchar2(10), col_b varchar2(10));"
(2) in JDeveloper (10.1.3.3.0), drop "my_table" on a database diagram, creating an offline table
(3) in SQL*Plus, "drop table my_table;"
(4) in SQL*Plus, "create table my_table(col_a varchar2(10));"
(5) in JDeveloper on the offline table node "MY_TABLE", select "Refresh from [myConnection]" from the context menu
(6) answer the question "Refreshing the object from myConnection will lose any changes made offline. Are you sure?" with "Yes"
(7) nothing happens, the offline table "MY_TABLE" still has colums "COL_A" and "COL_B"
What am I missing here, why isn't the offline table "refreshed" after step (6)?
many thanks
Jan Vervecken

Jan,
Using JDeveloper 10.1.3.2 , I'm getting this error:
Stack trace:
java.lang.NullPointerException
at oracle.javatools.db.ora.BaseOracleDatabase.listObjectsImpl(BaseOracleDatabase.java:1336)
at oracle.javatools.db.AbstractDBObjectProvider.createSchemaObjectImpl(AbstractDBObjectProvider.java:678)
at oracle.javatools.db.AbstractDBObjectProvider.getObjectImpl(AbstractDBObjectProvider.java:600)
at oracle.javatools.db.AbstractDatabase.getObjectImpl(AbstractDatabase.java:588)
at oracle.javatools.db.AbstractDBObjectProvider.getObject(AbstractDBObjectProvider.java:814)
at oracle.jdeveloper.cm.dt.ui.SchemaObjectDescriptor.unwrapDescriptor(SchemaObjectDescriptor.java:220)
at oracle.jdeveloper.offlinedb.handler.AbstractTransferHandler.getSchemaObject(AbstractTransferHandler.java:1414)
at oracle.jdeveloper.offlinedb.handler.AbstractTransferHandler.getSchemaObjects(AbstractTransferHandler.java:1397)
at oracle.jdeveloper.offlinedb.handler.AbstractTransferHandler.copyObjectsForTransfer(AbstractTransferHandler.java:951)
at oracle.jdeveloper.offlinedb.handler.ImportHandler.copyObjectsForTransfer(ImportHandler.java:1030)
at oracle.jdeveloper.offlinedb.handler.AbstractTransferHandler.run(AbstractTransferHandler.java:353)
at oracle.ide.dialogs.ProgressBar.run(ProgressBar.java:551)
at java.lang.Thread.run(Thread.java:595)
Regards,
Koen -
Hi All,
I am using JDeveloper 11.1.1.6.0.
I have created my ADF page with a panelTabbed component (it is not a template). In each tab I have dragged other task flows in as regions. I have an outputText on the page which should be refreshed by a button click inside the region.
How do i achieve that?
below is the code
<?xml version='1.0' encoding='windows-1252'?>
<jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
xmlns:af="http://xmlns.oracle.com/adf/faces/rich"
xmlns:f="http://java.sun.com/jsf/core">
<af:pageTemplate viewId="/OverviewTemplate.jspx" id="OverviewTemplate">
<f:facet name="userOverview">
<af:panelStretchLayout id="psl1" styleClass="AFStretchWidth">
<f:facet name="center">
<af:panelTabbed id="pannelTabbedOverview"
binding="#{userBean.panelTabbed}">
<af:showDetailItem text="Overview" id="showDetailItem1"
stretchChildren="first"
binding="#{userBean.overviewTab}">
<af:panelStretchLayout id="overviewtabPSL" topHeight="25px">
<f:facet name="center">
<af:region value="#{bindings.overviewtaskflow1.regionModel}" id="r1"/>
</f:facet>
</af:panelStretchLayout>
</af:showDetailItem>
</af:panelTabbed>
</f:facet>
<f:facet name="top">
<af:panelBox text="PanelBox1" id="pt_pb1" styleClass="AFStretchWidth">
<af:outputFormatted value="#{bindings.userStatus.inputValue}" id="pt_of1"/>
</af:panelBox>
</f:facet>
</af:panelStretchLayout>
</f:facet>
</af:pageTemplate>
</jsp:root>

Here I want to refresh the userStatus value from a button which is inside the overviewtaskflow1 region. Please help me out.
regards,
Rajan

Set partial triggers on your outputFormatted component, similar to the example below:
partialTriggers="r1:<btn-id>"
Since some of the ADF components are naming containers (while generating IDs for children, their presence is reflected by prepending their ID), select r1 from JDev's partial trigger selection window; you can get the rest of the string from the jsff page in your TF, and append both with ":".
1. Say JDev shows the r1 reference as ":::r1"
2. Your button reference inside the TF from the top component is "pt1:t1:btn1"
So your partial trigger would become 1 + ":" + 2 = ":::r1:pt1:t1:btn1"
This is just an example; you might have to compute your own strings for PPR.
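The concatenation above can be expressed mechanically; a tiny Python sketch using the hypothetical IDs from the example (not real component IDs from any project):

```python
def partial_trigger_path(region_ref, component_path):
    """Join the region reference shown by JDev's partial-trigger picker
    with the component's path inside the task flow, separated by ':'.
    Both arguments are example values from the discussion above."""
    return region_ref + ":" + component_path

print(partial_trigger_path(":::r1", "pt1:t1:btn1"))
# -> :::r1:pt1:t1:btn1
```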
Transfer documents from on-premise SP 13 to online office 365
My company has been testing the waters in on-premise SP 2013 and recently thought about saving some money and putting moving over to Office 365.
Our concern is the ease of transferring all documents (1000+), especially nested folders, from on-premise to online.

I think your best viable solution is to use 3rd-party tools; there is no MSFT-supported way to migrate the data from on-prem to online. Tools such as ShareGate, MetaVis Migrator, etc.
Another thought: did you try saving the library as a template (including content), then uploading that template to the online site and creating a new library from it? I never tested this, just thought about it.