Does Visio's Automatic Refresh Pull from the Data Source Automatically?
Hey Guys,
So, I usually link my Visio to a macro-enabled Excel workbook that pulls from my company's database. If I select the "Automatic Refresh" box, does Visio automatically run the workbook's macro (Alt + F8), or do I still need to run it manually in order for the auto refresh to work?
Because the workbook is macro-enabled and needs to pull data from our database, I currently have to run the macro in Excel in order for the new data to refresh into my Visio charts.
Thanks
Also, the guidance from the Visio group is to connect directly to your data source, BUT to have as much data manipulation and calculation as possible done beforehand (macros/views/sprocs/etc.), because while Visio can do a lot of calculations, it is only meant to be the presentation layer and isn't as efficient.
Check out Chris Hopkins' video here:
http://channel9.msdn.com/Events/TechEd/NorthAmerica/2014/OFC-B316#fbid=
The section on data source best practices (as well as everything else) should be very helpful.
Christopher Webb | Microsoft Certified Master: SharePoint 2010 | Microsoft Certified Solutions Master: SharePoint Charter | Microsoft Certified Trainer| http://tealsk12.org Volunteer Teacher | http://christophermichaelwebb.com
Similar Messages
-
Data deleted in ODS, data coming from the data source : 2LIS_11_VASCL
Friends,
I have a situation:
Data deleted in ODS, data coming from the data source : 2LIS_11_VASCL.
When I tried to delete a request for the above ODS and data source, the whole data got deleted.
All the data was deleted in the foreground; no background job was generated for this.
I am really worried about this issue. Can you please tell me what the possible causes are?
Many Thanks
VSM

Hi,
I suppose you want to know the possibility of getting the data back.
If the entire data is being deleted, you can reload the data from source system.
Load the setup table for your application. Then carry out init request.
Please note that you would have to take a transaction-free period for carrying out this activity so that no data is missed.
Once this is done, delta queues will again start filling up. -
Data Extraction issue from the data source 2LIS_08TRTLP
Hi All,
We are facing an issue with extraction from data source 2LIS_08TRTLP (Shipment Delivery Item Data per Stage) in BW. For some outbound delivery numbers, Gross Weight (BRGEW) comes through as zero even though a value shows in the source system (ECC) in VL03 and in the LIPS table. BRGEW is directly mapped in the BW InfoSource. There is no restriction in the extraction, and 0 is coming for this field in the first-layer extraction. Even in the PSA all values are zero.
We have checked those deliveries; the delivery is being split into batches, but not all of the split deliveries show this problem.
With Thanks
Shatadru

Yes, I have done the same: I filled the setup table for the shipment numbers related to that delivery number and checked in RSA3, and the required value is coming in RSA3. That means there is no problem in the extractor. But I cannot pull the data into BW now, as it is in Production and a delta comes every day. Still, in the first-layer ODS there is no value for that entry which was previously loaded (though there is no routine or any restriction).
But I have one observation on the data source: that particular field is marked for INVERSION in the source. Yet the delivery in question is not cancelled, returned or rejected; PGI was created for it, and its delivery status is completed. -
Hi all.
I think that the problem I want to discuss is well-known, but still I got no answer whatever I tried ...
I installed the BIEE on Linux (32 bit, OEL 5 - to be more precise), the complete installation was not a big deal. After that I installed the Administration tool on my laptop and created the repository. So... my tnsnames.ora on the laptop looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.5)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb)
)
)
And the tnsnames.ora on server, in its turn, looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = localhost.localdomain)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb.localdomain)
)
)
The database worked normally and I created and transferred the repository to the server and started it up.
It started without any errors, but when I tried to fetch the data via Presentation Services I got the error:
Odbc driver returned an error (SQLExecDirectW).
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
[nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
I discovered that the ODBC DSN on my laptop was not named correctly (it should be identical to the tnsnames entry), so I corrected it, saved and replaced the repository on the server and restarted it... and still got the same error.
Apparently, something is wrong with the data source. So let me put here some more information...
My user.sh looks like this:
ORACLE_HOME=/u01/app/ora/product/11.2.0/dbhome_1
export ORACLE_HOME
TNS_ADMIN=$ORACLE_HOME/network/admin
export TNS_ADMIN
PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
export PATH
LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
and my odbc.ini looks like this:
[ODBC]
Trace=0
TraceFile=odbctrace.out
TraceDll=/u01/OracleBI/odbc/lib/odbctrac.so
InstallDir=/u01/OracleBI/odbc
UseCursorLib=0
IANAAppCodePage=4
[ODBC Data Sources]
AnalyticsWeb=Oracle BI Server
Cluster=Oracle BI Server
SSL_Sample=Oracle BI Server
TESTDB=Oracle BI Server
[TESTDB]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=SH
Catalog=
UID=
PWD=
Port=9703
[AnalyticsWeb]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
Catalog=
UID=
PWD=
Port=9703
[Cluster]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
FinalTimeOutForContactingCCS=60
InitialTimeOutForContactingPrimaryCCS=5
IsClusteredDSN=Yes
Catalog=SnowFlakeSales
UID=Administrator
PWD=
Port=9703
PrimaryCCS=
PrimaryCCSPort=9706
SecondaryCCS=
SecondaryCCSPort=9706
Regional=No
[SSL_Sample]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=localhost
Repository=
Catalog=SnowflakeSales
UID=
PWD=
Port=9703
SSL=Yes
SSLCertificateFile=/path/to/ssl/certificate.pem
SSLPrivateKeyFile=/path/to/ssl/privatekey.pem
SSLPassphraseFile=/path/to/ssl/passphrase.txt
SSLCipherList=
SSLVerifyPeer=No
SSLCACertificateDir=/path/to/ca/certificate/dir
SSLCACertificateFile=/path/to/ca/certificate/file.pem
SSLTrustedPeerDNs=
SSLCertVerificationDepth=9
Can anybody point out where the error is? According to the documentation it should work fine. Maybe the driver name is wrong? What driver do I need then?
Because I can't find it.
I'm really sorry to bother you, guys :) Let me know if you get some ideas about it (Metalink didn't help).

OK, several things are wrong here. First, odbc.ini is not meant to be used for Oracle databases; that's not supported on Linux. On Linux you should use OCI (the Oracle native drivers), and nothing should be added to odbc.ini. Second, your user.sh seems to be pointing to your DB installation path. This is not correct: it should point to your Oracle client installation, so you need to install the full Oracle client somewhere. Typically this is done with the same OS account as the one used for OBIEE, whereas the DB normally runs under the oracle account. Once you have the client installed, test it under the OBIEE account by doing tnsping and sqlplus to your DB. Also, LD_LIBRARY_PATH should point to $ORACLE_HOME/lib32, not lib, as the lib directory holds the 64-bit libraries and OBIEE uses the 32-bit libraries even on 64-bit OSes. Finally, change your RPD connection to use OCI. Make all those changes and you should be good.
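Putting those points together, a corrected user.sh might look like the sketch below. The client path /u01/app/oracle/client is a hypothetical placeholder; substitute wherever the full 32-bit Oracle client actually gets installed:

```shell
# user.sh - point OBIEE at the Oracle *client* home, not the DB home
ORACLE_HOME=/u01/app/oracle/client   # hypothetical full-client install path
export ORACLE_HOME
TNS_ADMIN=$ORACLE_HOME/network/admin
export TNS_ADMIN
PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
export PATH
# OBIEE loads 32-bit libraries even on a 64-bit OS, hence lib32 rather than lib
LD_LIBRARY_PATH=$ORACLE_HOME/lib32:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
```

After sourcing this under the OBIEE OS account, tnsping and sqlplus against the DB should succeed before you retry the RPD over OCI.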
-
Hi All,
I am reading Notepad files and inserting their data into SQL tables.
While performing the SQL bulk copy, the line bulkcopy.WriteToServer(dt); throws the data-type-related exception mentioned in the subject.
Please go through my logic and tell me what to change to avoid this error:
public void Main()
{
    string[] filePaths = Directory.GetFiles(@"C:\Users\jainruc\Desktop\Sudhanshu\master_db\Archive\test\content_insert\");
    for (int k = 0; k < filePaths.Length; k++)
    {
        string[] lines = System.IO.File.ReadAllLines(filePaths[k]);
        // table name is extracted from the file name (last path segment, before the extension);
        // use filePaths[k], not filePaths[0], or every file gets the first file's table name
        string[] pathArr = filePaths[k].Split('\\');
        string tablename = pathArr[pathArr.Length - 1].Split('.')[0];
        DataTable dt = new DataTable(tablename);
        // second line of the file holds the pipe-delimited column headers
        string[] arrColumns = lines[1].Split(new char[] { '|' });
        foreach (string col in arrColumns)
            dt.Columns.Add(col);
        for (int i = 2; i < lines.Length; i++)
        {
            string[] columnsvals = lines[i].Split(new char[] { '|' });
            DataRow dr = dt.NewRow();
            for (int j = 0; j < columnsvals.Length; j++)
            {
                if (string.IsNullOrEmpty(columnsvals[j]))
                    dr[j] = DBNull.Value;
                else
                    dr[j] = columnsvals[j];
            }
            dt.Rows.Add(dr);
        }
        using (SqlConnection conn = new SqlConnection())
        {
            conn.ConnectionString = "Data Source=UI3DATS009X;" + "Initial Catalog=BHI_CSP_DB;" + "User Id=sa;" + "Password=3pp$erv1ce$4";
            conn.Open();
            using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
            {
                bulkcopy.DestinationTableName = dt.TableName;
                // map source columns to destination columns by name, so the load
                // no longer depends on column order in the file
                foreach (DataColumn col in dt.Columns)
                    bulkcopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                bulkcopy.WriteToServer(dt);
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
Issue 1: I am reading the Notepad file and getting all columns and values into my DataTable. When inserting, date/time and integer fields need explicit conversion. How do I write that for specific columns before bulkcopy.WriteToServer(dt);?
Issue 2: The Notepad file does not contain all the columns, nor in a specific sequence. I can add the few missing columns myself, which I am doing now, but then the DataTable holds my columns plus the Notepad columns, and while inserting, how do I assign them to the particular destination columns?
sudhanshu sharma Do good and cast it into river :)

Hi,
I think you'll have to do an explicit column mapping if they are not in exact sequence in both source and destination.
Have a look at this link:
https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopycolumnmapping(v=vs.110).aspx
Good Luck!
Kaur.
Please mark as answer if this resolves your issue. -
I created a calculated column "Expiration Date" in SharePoint 2010 with formula, =IF([Contingent Hire]=TRUE,(Created+90),(IF([Contingent Hire]=FALSE," ")))
This works in the SharePoint list, but when I go to edit the Edit Form in Designer, I get the error specified in the title of this post. I'm trying to make it so the Expiration Date is blank when another column, "Contingent Hire" (a Yes/No column), is FALSE.
The Edit Form is essentially a DataViewWebPart. If I remove the " " from the calculated column, like so: (IF([Contingent Hire]=FALSE,)), the error goes away; however, the Expiration Date field does not remain blank like I want it to.
Does anyone have any suggestions? (Below is the error generated when I open Designer and then try to open the Edit form for the corresponding list containing the calc column)
JackSki123

Hi Jack,
Could you provide a screenshot about this issue?
As Dimitri suggested, you can install the update for your SharePoint Designer and check again.
And you can also check if you can display "NA" instead of " " in your calculated column per the following post.
http://rajeshspillai.blogspot.in/2012/03/server-returned-non-specific-error-when.html
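As a hedged sketch of that workaround, the calculated column would return the literal text "NA" rather than a space when [Contingent Hire] is FALSE (note the column then yields text, not a true date, so date formatting and sorting may change):

```
=IF([Contingent Hire]=TRUE, Created+90, "NA")
```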
Thanks
Daniel Yang
TechNet Community Support -
Error: unable to retrieve column information from the data source
Hi,
I am trying to set up a data flow task. The source is "SQL Command" which is
a stored procedure. The proc has a few temp tables that it outputs the final
resultset from. When I hit preview in the ole db source editor, I see the
right output. When I select the "Columns" tab on the right, the "Available
External Column List" is empty. Why don't the column names appear? What is
the work around to get the column mappings to work b/w source and
destination in this scenario.
In DTS previously, you could "fool" the package by first compiling the
stored procedure with hardcoded column names and dummy values, creating and
saving the package and finally changing the procedure back to the actual
output. As long as the columns remained the same, all would work.
That's not working for me in SSIS.
Thanks in advance.
Asim.

I had a similar problem retrieving column information when calling a stored procedure that returned a table declared as a variable within the procedure. I used "SQL Command" as the data access mode within the OLE DB Source Editor, and "exec <sproc_name> <param_list>" as the text of the SQL command.
After a few trials and errors, I noticed that when I removed the IF statement that validated the parameter passed to the sproc, it worked, i.e. SSIS could retrieve the column information; otherwise it wouldn't. Is it because of the IF, the BEGIN/END, or the RETURN? I do not know, as I did not test further, but removing the entire instruction block pasted below made it work. Hope that helps.
if @Reel_Ou_Budgete not in ('B', 'R')
begin
print 'Le paramètre de cette procédure stockée doit être B ou R.'
return -1
end -
Where is the data source configuration in Jdev 11g?
Does anybody know where the data source configuration is in JDeveloper 11g? I want to create a JNDI data source using the database connection. Do I have to log in to the Integrated WLS to create the data source? Thanks.
Hi Chris,
Configuring the data source in Integrated WLS works for me. Thanks for your help!
It looks like 11g didn't create the data source automatically in the App Navigator. I checked the deployed application directories, "Application Data\JDeveloper\system11.1.1.2.36.55.36\o.j2ee\drs\TestDS\adf\META-INF" and "Application Data\JDeveloper\system11.1.1.2.36.55.36\o.j2ee\drs\TestDS\META-INF", and didn't find the data source file, data-sources.xml. Then I started the web application and checked the Context; I can only find the data source which I defined in the Integrated WLS. It is interesting that 11g removed the JNDI list from the RMI bindings.
Also thanks to Nick for the information. We are migrating a Spring Framework application to Oracle and don't use ADF Business Components in this project.
Huaichen -
Problem with the data source and web.xml
I have an issue where JSC is removing my resource reference:
<resource-ref>
<description>Creator generated DataSource Reference</description>
<res-ref-name>jdbc/localOracleDatabase</res-ref-name>
<res-type>javax.sql.DataSource</res-type>
<res-auth>Container</res-auth>
</resource-ref>
from the web.xml and sun-web.xml.
The application has been working great in the IDE for months, then wham, no more data source definition. I try to add the reference manually and the IDE takes it out. I am NOT adding it to the .xml files in the build area. Why is JSC removing the data source entry?

This continues to be a problem. The only way I can get around it is to drag a table from the data source onto the design palette, and then the data source is added back to the web.xml. I can run fine for 10 or 15 runs, then the entry is once again removed from the web.xml.
Help please! -
Hi gurus,
I have a scenario where I have to load data from 3 different data sources into a single Z-cube. My question to you is: what are the advantages and disadvantages of this scenario?
regards
lakshmi

Hi,
The scenario is common; the only things to watch are:
1) How are you managing the loads so that records are not getting doubled? A cube has no overwrite facility, and every record sits on top of the previous one, no matter whether they are the same or different. So you have to make sure that you are getting accurate information from the data source.
2) The delta from the data source should be accurate and should be in accordance with the structure of the InfoCube. Not all kinds of deltas can be loaded into the cube directly; some have to be loaded through a DSO.
3) You also have to take care of delta management in case of data load failures, which becomes tough to manage if there is no DSO in between.
I would always advise going with a DSO in between if you are loading to the cube.
Thanks
Ajeet -
How to find out the data source information for particular tables
Hi,
Can you please tell me the process to find out the data source information for particular tables?
For example: T2503, T2507, T25A1, T25A2, etc.
I am doing reverse engineering to find out the data sources built on the above SAP tables.
Thanks.

Hi,
We still haven't received the field-level information. Before they send it, we should first give them the corresponding data sources for the COPA tables they have provided.
I have searched help.sap.com but didn't find any information on this.
Please let me know if there is any way to find the data source details for SAP tables.
Thanks -
Info cube key figure quantity is more than the data source
Hi,
I am loading data from a custom table data source into an InfoCube. The record count and key figure quantity in the InfoCube are higher than in the data source custom table.
I checked the extraction in RSA3. The number of records and the key figure quantity are extracted exactly from the data source in the RSA3 transaction.
But after loading via the InfoPackage, the number of records and the key figure quantity in the InfoCube are more than the actual records available in the data source. The update rule is a simple 1-to-1 mapping.
Kindly help to troubleshoot this issue.
thanks and regards
Murugesan

Hi,
There could be some other data already available in the InfoCube. Please take the Request ID from the last run of the InfoPackage you used and check the records only for that Request ID in the InfoCube.
You can check the data in the LISTCUBE Transaction.
Please let me know of you need additional information.
Thanks,
Jeysraj -
Data in PSA content is different than the data source
Dear Experts,
I am using a process chain to update an InfoCube from the data source of a Z table. In the process chain, the delete-PSA-request step has the PSA table number of the correct data source. The InfoPackage and the DTP also have the correct data source.
But after executing the process chain, the InfoCube is not updated with the required data source. It is updated with data from some other data source belonging to a different custom table.
I have checked the extraction of the data source in RSA3; it extracts from the correct data source. But when I check the PSA contents through the process monitor in the process chain log for the DTP, I find the PSA content is different from the data source.
I am totally confused, how this is possible. Can anybody advise me to troubleshoot and fix this issue.
Thanks and regards
Murugesan

Dear Igor,
The PSA request is being deleted in a previous step prior to loading the info package. I can see every time when I run the process chain, the old request number is deleted and a new request number is created.
Still the same issue is persisting.
I also observed one more thing that the second time when we repeat the same process chain, the correct data source data is loaded in the info cube and PSA table. Only during the first time, the PSA table and Info cube it is not updating with the right data.
thanks and regards
Murugesan -
Does anyone know how to use Pages so you can export PDFs from the internet and automatically pull words from the document into the file name of the PDF (i.e., author, title of a scientific paper)? For example, if I am downloading a paper by Smith called "Surgery" that was published in 2002, it would automatically set the file name in the download to smith - surgery 2002. I have heard Pages is smart enough to do this.
Thank you

Pages can export only its own documents. They may be exported as PDF, MS Word, RTF or Text files.
Pages can import (i.e. open a file as, or insert a file into, a Pages document) documents in several formats, but won't rename the document as you describe. Documents that can be opened (e.g. Text, AppleWorks 6 WP, MS Word files) are converted to Pages documents, and retain their original names, with .pages replacing the original file extension. Files that can be inserted (generally .jpg, .pdf and other image files) become part of the existing Pages file and lose their names.
It may be possible, using AppleScript, to extract the text you want and to Save a Pages file using that text as the filename, but that would depend in part on being able to identify which text is wanted and which is not.
How will the script determine where the author's name begins and where it ends?
How will the script recognize the beginning and end of the title, and decide how much of the title to use in the filename?
How will the script recognize the year of publication?
For papers published in a specific journal, with a strict format for placing each of these pieces on information, or containing the needed information as searchable meta data in the file, this might be possible. But it would require knowledge of the structure of these files, and would probably handle only papers published in a specific journal or set of journals.
Outside my field of knowledge, but there are some talented scripters around here who might want to take a closer look.
Best of luck.
Regards,
Barry -
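For what it's worth, the renaming idea can also be sketched outside Pages entirely. The script below is a rough illustration in Python rather than AppleScript; it assumes the third-party pypdf library is installed and that the PDF actually carries /Author and /Title metadata, which, as Barry notes, many downloaded papers do not:

```python
import re
from pathlib import Path

def build_filename(author: str, title: str, year: str) -> str:
    """Combine the pieces into a filename like 'smith - surgery 2002.pdf'."""
    raw = f"{author} - {title} {year}".lower().strip()
    # drop characters that are illegal or awkward in filenames
    return re.sub(r'[\\/:*?"<>|]', "", raw) + ".pdf"

def rename_pdf(path: Path) -> Path:
    """Rename one PDF from its embedded metadata (best-effort sketch)."""
    from pypdf import PdfReader  # assumed dependency: pip install pypdf
    meta = PdfReader(str(path)).metadata or {}
    # the publication year is rarely stored directly; fall back to CreationDate
    year = re.search(r"(19|20)\d\d", str(meta.get("/CreationDate", "")))
    new_name = build_filename(
        str(meta.get("/Author", "unknown")),
        str(meta.get("/Title", path.stem)),
        year.group(0) if year else "",
    )
    return path.rename(path.with_name(new_name))

if __name__ == "__main__":
    print(build_filename("Smith", "Surgery", "2002"))  # -> smith - surgery 2002.pdf
```

Deciding where the author's name ends and the title begins is exactly the hard part identified above; metadata sidesteps it only when the publisher filled those fields in.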
Powerpivot Error on Refresh -- "We couldn't get data from the data model..."
I'm using Excel 2013 and Windows 8.1. I have a spreadsheet I've been using for over a year, and I've just started getting this error message when I try to refresh the data.
"We couldn't get data from the Data Model. Here's the error message we got:
The 'attributeRelationship' with 'AttributeID' - 'PuttDistCat9' doesn't exist in the collection"
Any idea how I can fix this problem? I haven't changed anything related to that particular attribute. All the data is contained in separate sheets in the workbook, so there are no external sources of data.
Thanks.
Jean

Thanks for all the suggestions.
I found a slightly older version of the spreadsheet that still refreshes properly, so I don't think I have any issues with the version of Excel or Power Query. (I've had this same error before, and I believe I applied the hotfix at that time.)
I think this problem started after I updated a number of the date filters in the pivot tables. I haven't made any changes to the data model, and the only updates I've made were to add data (which I do all the time), and to change the date filters on
the pivot tables.
As suggested, I added a new pivot table querying one table (the table with the attribute that shows up in the error message), and it worked fine. I can also refresh this pivot table.
Then I tried adding a pivot table which went against several tables in the data model (including the table in question). The pivot table seemed to return that data properly. However, when I tried to refresh it, I got the same error message ("we
couldn't get data from the data model...").
Dany also suggested running the queries one at a time to see which one is in error. Without checking all the pivot tables, it appears that any which use the table "HolePlayedStrokes" generate the error (this is the table with the attribute
mentioned in the error message). Pivot Tables without that particular table seem to refresh OK. Unfortunately, that is the main table in my data model, so most of the pivot tables use it.
Any other suggestions? I'd be happy to send a copy of the spreadsheet.
Thanks for all the help.
Jean