Dynamic Data Source with Flash file

I have a MySQL database which I am using to create a dynamic table for a shopping website. One of the fields in the database holds a video for each product, and the videos directory contains a number of Flash movies (.swf).
In Dreamweaver I have a dynamic table that uses the data from my MySQL database. When I preview the site, all of the dynamic data appears correctly; however, the video does not work. The steps I took to create the dynamic video link in Dreamweaver are as follows:
Insert>Media>Flash
Select File Name From: Data Sources
Field: Recordset>Video
I've checked the information in my MySQL database and the data there seems to be correct. If I remove the dynamic video information and manually place the necessary Flash video, the preview works fine.
So why won't my Flash videos work dynamically?


Similar Messages

  • Using an Excel file as a dynamic data source? Opinions needed...

    I have posted this topic before, but, as always, in order to get the right answer you have to ask the right question.
    I'm trying to create a number of price lists linked to an Excel/CSV file. I have an Excel file that contains price list information which is product specific.
    I have had a number of suggestions, which follow:
    A direct link to the Excel file. PROs: The Excel file can be uploaded to FTP and overwritten if (and when) amended. Linking this is easy peasy in Dreamweaver. The person browsing can download the info straight away on request - no hassle. CONs: Simply, not everyone has Excel, and those who don't cannot access the information.
    Import the Excel file as tabular data. PROs: Fairly easy to do in Dreamweaver. The person browsing can see the info straight away. CONs: Can be time consuming on larger Excel files. NOT amendable (so a number of price changes becomes a big job). Can't simply overwrite the Excel file on FTP. Larger Excel files can take a lot of page space and thus require tonnes of scrolling.
    Use the Excel file as a dynamic data source. (Not entirely sure how I would go about doing this; any suggestions/links/tutorials etc. welcome.) PROs: ? CONs: A contributor added that this is not a good idea, as it performs poorly on the web.
    Create a dynamic page using a database and import the Excel file into that, or maintain the price list in the database rather than in an Excel file. (Again, not entirely sure how I would go about doing this; any suggestions/links/tutorials etc. welcome.) My understanding of this option is that it will require XML data and Spry work. Is this correct? Can it be done by someone who is not an advanced user?
    If, once again, I'm off the mark and better suggestions come to mind, please share them.
    As you can see I'm at a bit of a crossroads, so any suggestions, comments, help, links, tutorials or applause would be greatly received.
    Thanks in advance and looking forward to seeing the comments this throws up!

    Hi
    Although not everyone has Excel, just about everyone can open CSV files in some way; if not, offer the option to download OpenOffice, which is free for the PC. As for the Mac, it has a program installed by default that can open CSV files.
    Importing the tabular data is not really an option, for the reasons stated.
    Using Excel as a data source is not a good idea: it requires ASP.NET to work correctly, otherwise it runs slowly, and it is not recommended if you are expecting more than a very small number of users.
    As for the dynamic option with a database, this can be done with XML and Spry, but if you have a large amount of data it is almost as bad as the option above. You are probably better off creating a database and importing your Excel spreadsheet into it (a minimal import sketch follows at the end of this message); for tutorials on creating a PHP/MySQL database and setting up a testing server, see http://www.adobe.com/devnet/dreamweaver/application_development.html.
    No matter which way you go with the last option, it will require a fair amount of knowledge and experience to do correctly, efficiently and securely.
    PZ
    www.pziecina.com
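    A minimal sketch of loading a price list, exported from Excel as CSV, into a MySQL table. The connection details, table name and column names are assumptions, and PyMySQL is used here only as an example client library; any MySQL driver would do.

        import csv
        import pymysql  # assumption: PyMySQL is installed (pip install pymysql)

        # Hypothetical connection details and price-list layout.
        conn = pymysql.connect(host="localhost", user="web", password="secret", database="shop")
        cur = conn.cursor()
        cur.execute(
            """CREATE TABLE IF NOT EXISTS pricelist (
                   sku VARCHAR(32) PRIMARY KEY,
                   description VARCHAR(255),
                   price DECIMAL(10, 2)
               )"""
        )

        # pricelist.csv is the Excel sheet saved as CSV with header columns sku, description, price.
        with open("pricelist.csv", newline="", encoding="utf-8") as f:
            rows = [(r["sku"], r["description"], r["price"]) for r in csv.DictReader(f)]

        # REPLACE keeps the table in step with the latest upload (overwrite-on-amend workflow).
        cur.executemany("REPLACE INTO pricelist (sku, description, price) VALUES (%s, %s, %s)", rows)
        conn.commit()
        conn.close()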

  • An error occurred querying a data source - with REST services

    Hi,
    I have a SharePoint 2013 form library with an InfoPath form. I need to get the logged-in user's 'Display Name' automatically on form load.
    I used a REST service to fetch the current user's details. In preview mode the form shows the right name, but when I publish the form to the library I get the following error.
    REST Service --> http://site url/_api/SP.UserProfiles.PeopleManager/GetMyProperties
    Please help me to resolve this issue.
    Thanks in advance for your time and reply :)

    Hi,
    According to your post, my understanding is that an error occurred querying a data source with REST services.
    It is most likely a permissions issue with the GetUserProfileByName service,
    and there could be many reasons for this problem. First try with a UDCX file, and make sure that the User Profile Service (UPS) is running (a quick test of the REST call itself is sketched at the end of this message).
    Here are some similar threads for your reference:
    http://social.technet.microsoft.com/Forums/en-US/b8c668ea-7511-4657-a1a8-08fb4a6bd53d/info-path-an-error-occurred-querying-a-data-source?forum=sharepointcustomizationprevious
    http://social.technet.microsoft.com/Forums/en-US/46866ac2-da09-4340-a86a-af72cbb2c8d7/info-path-an-error-occurred-querying-a-data-source-?forum=sharepointcustomization
    http://blogs.msdn.com/b/russmax/archive/2012/08/17/want-to-call-sharepoint-2010-web-services-within-browser-based-infopath-2010-forms.aspx
    Best Regards,
    Linda Li
    TechNet Community Support
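    As a sanity check outside InfoPath, the GetMyProperties endpoint can be called directly. A minimal sketch, assuming an on-premises site with Windows (NTLM) authentication; the URL, domain, account and password below are placeholders:

        import requests
        from requests_ntlm import HttpNtlmAuth  # assumption: requests-ntlm is installed

        # Hypothetical site URL and account; adjust to the real environment.
        url = "http://siteurl/_api/SP.UserProfiles.PeopleManager/GetMyProperties"
        resp = requests.get(
            url,
            auth=HttpNtlmAuth("DOMAIN\\username", "password"),
            headers={"Accept": "application/json;odata=verbose"},
        )
        resp.raise_for_status()

        # DisplayName is one of the PersonProperties fields returned by GetMyProperties.
        print(resp.json()["d"]["DisplayName"])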

  • Dynamic data source in Excel Pivot Table

    Hello there,
    I am trying to use a dynamic data source in a pivot table using INDIRECT, but I am getting the error "Reference not valid". Here is how I set up the reference:
    Named range: ConsolLink = "'R:\Root\Sub1\Sub2\Consol File.xlsm'!Source_OpexConsol"
    "Source_OpexConsol" is defined in the source file as a dynamic name using offset formula.
    In the pivot data source, I have tried "=INDIRECT(ConsolLink)" as the data source but that does not work.
    I have also tried using INDIRECT in ConsolLink and just referencing "ConsolLink" as the data source. That does not work either.
    I am not using Power Pivot. Appreciate it if someone can help.
    Thanks.

    If the source workbook is open, then try
    Named range: ConsolLink = Range("'Consol File.xlsm'!Source_OpexConsol")
    And if it is not currently open, then try
    Dim W As Workbook
    Set W = Workbooks.Open("R:\Root\Sub1\Sub2\Consol File.xlsm")
    Named range: ConsolLink = Range("'Consol File.xlsm'!Source_OpexConsol")
    W.Close False
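    A rough sketch of repointing an existing pivot table at the external named range via COM automation; it assumes Windows with Excel and the pywin32 package, hypothetical workbook/sheet/pivot names, and that the source workbook is opened first so the external name can be resolved:

        import win32com.client  # assumption: pywin32 is installed

        xl = win32com.client.gencache.EnsureDispatch("Excel.Application")
        src = xl.Workbooks.Open(r"R:\Root\Sub1\Sub2\Consol File.xlsm")  # source must be open
        wb = xl.Workbooks.Open(r"C:\Reports\OpexPivot.xlsx")            # hypothetical report file

        # 1 = xlDatabase; the source string is the external named range, quoted as above.
        cache = wb.PivotCaches().Create(1, "'Consol File.xlsm'!Source_OpexConsol")
        wb.Worksheets("Pivot").PivotTables("PivotTable1").ChangePivotCache(cache)

        wb.Save()
        src.Close(False)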

  • Using JDBC Data Sources with ADFBC, NoInitialContextException

    Using JDBC Data Sources with ADF Business Components, NoInitialContextException
    I followed the instructions in the link below to create an ADF Swing application using a data source. I am using JDeveloper version 10.1.3.
    http://www.oracle.com/technology/products/jdev/howtos/10g/usingdatasources/using_datasources.html
    The ADF generated code looks like this:
    JUMetaObjectManager.setErrorHandler(new JUErrorHandlerDlg());
    JUMetaObjectManager mgr = JUMetaObjectManager.getJUMom();
    mgr.setJClientDefFactory(null);
    BindingContext ctx = new BindingContext();
    ctx.put(DataControlFactory.APP_PARAM_ENV_INFO, new JUEnvInfoProvider());
    ctx.setLocaleContext(new DefLocaleContext(null));
    HashMap map = new HashMap(4);
    map.put(DataControlFactory.APP_PARAMS_BINDING_CONTEXT, ctx);
    mgr.loadCpx("datasource.view.DataBindings.cpx" , map);
    final FormMain frame = new FormMain();
    frame.setBindingContext(ctx);
    I got this error when executing the last line: frame.setBindingContext(ctx);
    (oracle.jbo.common.ampool.ApplicationPoolException) JBO-30003: The application pool (datasource.datamodel.AppModuleDS) failed to checkout an application module due to the following exception:
    ----- LEVEL 1: DETAIL 0 -----
    (oracle.jbo.JboException) JBO-29000: Unexpected exception caught: oracle.jbo.DMLException, msg=JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 2: DETAIL 0 -----
    (oracle.jbo.DMLException) JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 3: DETAIL 0 -----
    (javax.naming.NoInitialContextException) Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    If I configure the application module connection type as JDBC URL, everything works.
    If the connection type is JDBC DataSource, I get the above error.
    Can someone show me how to adjust the ADF-generated code to use a data source?

    ADF BC has a bug: with a data source configured in the application module, the application module does not connect. Use a JDBC connection URL instead.
    Also refer to
    ADF BC: JDBC URL vs JDBC DataSource

  • What does a white screen with a flashing file icon with a question mark mean?

    What does this mean? We are getting a white screen with a flashing file icon with a question mark inside it.

    It almost certainly means it can't find a bootable volume (one with OS X on it).
    Which means that, most likely, you will have to find your original system installation DVD and reinstall the OS from it. Then you should figure out what caused the problem.
    Read this to diagnose it before resorting to reinstalling the OS:
    http://support.apple.com/kb/TS1440
    Excerpt:
    Additional steps
    If your Mac still starts to a flashing question mark, follow the steps below. If any step resolves the issue, you don't need to continue to the next one.
    Select your Mac OS X startup disk with Startup Manager by restarting and holding the Option key. After your Mac starts up, restart again to verify that the flashing question mark does not appear.
    If the issue persists, insert your Mac OS X installation disc. Be sure to either use the disc that came with your Mac, or, if you installed a later Mac OS X version from disc, use the newer disc.
    MacBook Air note: On a MacBook Air, there are two options for starting up from Mac OS X media: Either connect a MacBook Air SuperDrive to the MacBook Air via the USB port and restart the computer, holding down the C key during startup, or use Remote Install Mac OS X to startup from a system software DVD that's located on a partner computer. Once started up from Mac OS X media, skip to step 3.
    Restart the computer, then hold the C key during startup.
    From the Utilities menu, choose Disk Utility. Don't click Continue.
    Select your Mac OS X disk (named "Macintosh HD" by default) in the left side of the Disk Utility window.
    Click the First Aid tab.
    Click Repair Disk to verify and repair any issues with your Mac OS X startup disk.
    After repairing the disk, try to start up normally.
    Important: If Disk Utility finds issues it cannot repair, you may need to back up as much of your data as possible (or use Time Machine to back up to a different disk), then erase the disk and reinstall Mac OS X. You should back up important files and data before erasing a drive. Erasing deletes everything on the hard disk (including things on your desktop). Also, you can install Mac OS X onto an external disk, start from the external disk, and use Migration Assistant to transfer items from your usual internal Mac OS X startup disk to the external disk, then erase the internal disk and reinstall Mac OS X.
    If the issue persists, and Disk Utility didn't find any irreparable issues, quit Disk Utility, quit the Installer, select your disk when prompted, and restart.
    If the issue continues, reset PRAM. Note: After resetting PRAM, if the computer starts up normally, reselect the startup disk in the Startup Disk preferences.
    If none of these steps resolve the issue, start up from the Mac OS X Installation disc and reinstall Mac OS X.

  • When to call DSDisposeHandle when you have a DLL function acting as a repeating DS dynamic data source?

    Hi 
    I have a DLL function acting as a dynamic data source within a LabVIEW application (see attachment). I am using DSNewHandle to dynamically allocate 2D array handle storage via the LabVIEW memory manager for the arbitrarily sized data arrays to be passed into the application. I had assumed that these memory blocks would (magically) be disposed of when appropriate by the built-in LabVIEW processing blocks that sit downstream; however, this is not the case, and the LabVIEW system memory usage keeps increasing until the LabVIEW environment is exited (not simply when the offending application is stopped or closed).
    So my question is how and where to mop up the used array buffers in the application processing chain, and how do I know when the buffers are actually exhausted and not being reused downstream? For instance, the two 2D arrays are first combined into a complex 2D array by the re+im-to-complex LabVIEW operator. Is the data memory out of this stage totally different from the inputs, or is it some modified version of the input arrays? If it is totally different, and the input 2D arrays are not wired into any other blocks, why does the operator not dispose of the input data stores?
    Perhaps I'm going about this the wrong way by having the DLL data source dynamically allocate the data space arrays? Any advice would be gratefully received.
    Regards
    Steve
    Solved!
    Attachments:
    DataGetFloat.PNG (23 KB)

    Then instead of doing:
    DLLEXPORT int32_t DataGetFloatDll(... , Array2DFloat ***p_samples_2d_i, ...)
    {
        if ( p_samples_2d_i )
        {
            /* *p_samples_2d_i can be non-NULL because of a performance optimization where LabVIEW will pass in the same handle
               that you returned in a previous call of this function, unless some other LabVIEW diagram took ownership of the handle.
               Your C code can't really take ownership of the handle: it owns the handle for the duration of the function call and either
               has to pass it back or deallocate it (and if you deallocate it, you had better NULL out the handle before returning from
               the function, or return a different newly allocated handle). A NULL handle for an array is valid and treated as an empty array. */
            *p_samples_2d_i = (Array2DFloat **) DSNewHandle( ( sizeof(int32_t) * 2 ) + ( sizeof(float) * channel_count * sample_count ) );

            /* Generally you should first try to insert the data into the array before adjusting the size. Safest is to adjust the size
               after filling in the data if the array gets bigger with respect to the passed-in array, and do the opposite if the adjusted
               handle happens to get smaller. This is only really important if your C code can bail out of the code path because of error
               conditions between adjusting the handle size and adjusting the array sizes. You should definitely avoid returning from this
               function with the array dimensions indicating a bigger size than what the handle is really allocated for, which can happen
               if the array was resized to a smaller size and you then return because of errors before adjusting the dimension sizes in
               the array. */
            (**p_samples_2d_i)->Rows = channel_count;
            (**p_samples_2d_i)->Columns = sample_count;
        }
    }
    you should be doing:
    DLLEXPORT int32_t DataGetFloatDll(... , Array2DFloat ***p_samples_2d_i, ...)
    {
        MgErr err = NumericArrayResize(fS /* array of singles */, 2 /* number of dims */, (UHandle*)p_samples_2d_i, channel_count * sample_count);
        if (!err)
        {
            /* Fill in the data somehow, then adjust the dimension sizes. */
            (**p_samples_2d_i)->Rows = channel_count;
            (**p_samples_2d_i)->Columns = sample_count;
        }
    }
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Not able to create a Data source with qty field in the table

    Hi, I have the following scenario:
    I am doing a generic R/3 extraction for an R/3 system. I created a view from two tables and added a net quantity field to the view. But when I created a DataSource for that view, I got an error message saying that it cannot create a DataSource with a quantity field in the table.
    Here is my question: how do I create a DataSource for a view that has a quantity field in it?
    Please help with this.
    Senthil

    Hi Senthil,
    You need to add a reference table for the unit when the quantity field is added to the view you create. You can find the reference table name in the table the field is brought from, under its currency/quantity fields.
    Add that reference field and table in the currency/quantity fields section of the view, and then activate it.
    I think that works.
    Regards
    vamsi

  • Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of '

    When I deploy the cube, which is sitting on my PC (local), the following 4 errors come up:
    Error 1 The datasource , 'AdventureWorksDW', contains an ImpersonationMode that that is not supported for processing operations.  0 0 
    Error 2 Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'.  0 0 
    Error 3 Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.  0 0 
    Error 4 Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed.  0 0 

    Sorry, I hit the wrong button there. That is not the entire solution: setting it to default works when using a single box, but not in a distributed application. If you are creating the Analysis Services database manually or using the wizard, you can set the impersonation to your heart's content, as long as the right permission has been set on the Analysis Services server.
    In my case I was using MS Project Server 2010 to create the database in the OLAP configuration section. The underlying build script is configured to use the default setting, which is the SQL Service account, and this account does not have permission in Project Server, I believe.
    Changing the account to match the Project service account allowed a successful build/creation of the database. My verdict is that this is a bug in Project Server, because it needs to include the option to choose the impersonation account when creating the database; that way it will not use the default, which led to my error in the first place. I do not think there is a one-size-fits-all fix for this problem; it is an environment-by-environment issue and should be resolved as such. The idea behind fixing it is this: if you are using the SQL Analysis Services service account as the account creating the database and cubes, then the default or service account is fine. If you are using a custom account, then set that custom account in the impersonation details after you have granted it the SQL Analysis Services administrator role. You can remove that role after the database is created and harden things by creating a role with administrative permissions.
    Hope this helps.

  • Shared Data Source with prompted credentials - errors out.

    Using Data Tools (VS 2010) to build a report.
    I have built a simple report while trying to diagnose this issue. The report is created in a project that was originally built in BIDS 2008 and migrated to Data Tools 2010. This project contains an original shared dataset that does NOT error out.
    Report contains
         1) shared data source with properties set to prompted credentials.
         2) shared dataset using data source above
         3) 2 parameters (month and year)
         4) Report uses shared dataset
         5) 2 text boxes in the report body to display the parameters values.
    When I try to preview this report I get the error:
         The execution failed for the shared data set "xxx"
         Cannot create a connection to data source 'Data source for shared dataset'
         Security processing failed with reason "3" ("password missing")
    Now, if I changed that same shared dataset to use a static login and password; the report renders fine.
    What is going on???

    Hi RoseNeedsAVacation,
    When you use a shared data source with the "Prompt for credentials" option, please use a SQL Server login credential to view the report in the BIDS or SSDT report designer environment.
    Note: The shared data source will automatically switch to SQL Server authentication mode if we use the "Prompt for credentials" option.
    After you deploy your report and shared data source to the report server, please remember to configure your shared data source to use the "Credentials supplied by the user running the report" option. Furthermore, select the "Use as Windows credentials when connecting to the data source" checkbox if the credentials that the user provides are Windows Authentication credentials. Do not select this check box if you are using database authentication (for example, SQL Server Authentication).
    For more information, please refer to the article below:
    New Data Source Page (Report Manager):
    http://technet.microsoft.com/en-us/library/ms180077(v=sql.100).aspx
    Regards,
    Elvis Long
    TechNet Community Support

  • Can we set the dynamic data source when using getReportParameters() ?

    Hello!
    I have a report where one of its parameters refers to a list of values (LOVs). This list of values is an SQL Query type. When the data source used in the report is defined in the BI Publisher server, I'm able to get the report parameters using the getReportParameters() function in my application. However, if the data source is not defined the function throws an exception, which is understandable.
    I decided to dynamically set the data source so that even if the data source used by the report is not defined in the BI Publisher server, it still will be able to get the LOVs for the parameter. I tried setting the JDBCDataSource of the dynamicDataSource for the ReportRequest object that I passed to the getReportParameters() function. Please see the sample code below:
    reportRequest.dynamicDataSource = new BIP10.BIPDataSource();
    reportRequest.dynamicDataSource.JDBCDataSource = new BIP10.JDBCDataSource();
    setReportDataSource(reportRequest.dynamicDataSource.JDBCDataSource, connectstr, jdbc, dc); //function to set the values for JDBCDataSource object
    reportParams = webrs.getReportParameters(reportRequest, uid, pwd); //call the getReportParameters
    I was expecting this to work, as this is what I did to dynamically set the data source before calling the runReport function. So, my question is: can we set the dynamic data source when using getReportParameters()? I tried this in both versions 10g and 11g; it does not seem to work on either version.
    Regards,
    Stephanie

    The report_id column of apex_application_page_ir_rpt can help us uniquely identify each saved report.
    We can assign this report_id value to a page item, and this page item can be put in the Report ID Item text box of the Advanced section of the Report Attributes page of the IR.
    This should load the saved report identified by report_id, and you can get rid of the JavaScript.
    Regards,
    Vishal
    http://obiee-oracledb.blogspot.com
    http://www.packtpub.com/oracle-apex-4-2-reporting/book
    Kindly mark the reply as helpful/correct if it solves your problem

  • Got a white screen with flashing file icon

    Why do I have a white screen with flashing file icon with question mark inside?

    It means your iMac cannot locate a valid system to boot from.
    Insert the CD that came with the iMac and allow it to boot. Once it gets to the point where it offers to install the OS, go to the top menu and choose Disk Utility. Perform a repair on the main drive, "Macintosh HD".

  • Generic Data Source with Function Module data mismatch in BI

    Hi All,
    I'm using a generic DataSource with a function module. When I execute the function module (which I created), I get 16,000 records, but when I run the extractor (in RSA3) I get a different number of records (in fact, more).
    When I run the InfoPackage in BI I get even more records than I got when executing the function module,
    and a single record is split into 2 records on the BI side (not all the records). How can that be possible?
    Is there anything missing in my explanation of the issue?
    If it is understood, please help me out.
    Thanks n Regards,
    ravi.

    The DataSource framework calls the function module several times:
    1. the initialization call,
    2. then several data calls, until you "RAISE NO_MORE_DATA".
    Check your coding: have you refreshed the necessary internal tables?
    Sven

  • Generic Data Source with Function Module data mismatch

    Hi All,
    I'm using a generic DataSource with a function module. When I execute the function module (which I created), I get 16,000 records, but when I run the extractor (in RSA3) I get a different number of records (in fact, more).
    When I run the InfoPackage in BI I get even more records than I got when executing the function module,
    and a single record is split into 2 records on the BI side (not all the records). How can that be possible?
    Is there anything missing in my explanation of the issue?
    If it is understood, please help me out.
    Thanks n Regards,
    ravi.

    Hi rkiranbi,
    1. First execute the function module with your parameters; you will get some records. Then go to transaction RSA3 and execute it:
    provide your DataSource name, and under Settings increase the values for options such as Data records/calls, Display extractor calls and Selections --> fields.
    Then pass parameters in RSA3 that match your function module selections in SE37. You should now get equal values for the function module
    selection and the RSA3 selection; if they do not match, you need to check the coding logic in the function module.
    2. If you are getting wrong values in the BI system, then check the following:
                  1. Compare the PSA data with the data target data (here you need to check characteristics as well as key figures);
                  if you find any mistake, change the coding in the function module according to the client requirement.
                  2. Compare the RSA3 data with the BI report data or the data target data.
                                 Work through the steps above properly and you will get a solution.
    thanks and regards,
    malli

  • Generic Data source with Infoset Query

    Hi All,
    Please provide links, PPTs, information, or PDFs on creating a generic DataSource with an InfoSet query, or
    please explain in clear steps how to create a generic DataSource with an InfoSet query.
    Thanks & Regards,
    Ravi

    Oops... being a BI guy, I always think of BW.
    I have never created one myself, but I suggest you follow this:
    A query can be created to extract information from master records, i.e. Infotypes. For example, by creating a query, the data relating to an employee contained in various Infotypes can be extracted.
    Procedure:
    Decide on the Infotypes we want to base the query on, and decide on the area where we want to query, i.e. the global area or the standard area. The standard area is client-specific and the global area includes all clients.
    Menu: HR – PM – Admin – Information System – Ad Hoc Query
    Select the standard area and select the user group already created.
    Creation of new query :
    TC SQ03 - Select Environment – Select Standard Area - Enter -- If a new user group is to be created, enter the name of the user group, click on create, enter the necessary information and exit after saving
    TC SQ02 - Enter the name of the Infoset – Create – enter the name of the Infoset - Data source --> Table join by basis table – give the name of a table, e.g. pa0000 - Enter - Click on insert table if we want to include more tables – give the names of the tables one by one and, after finishing, place the cursor on the joining lines and right-click to delete unwanted relationships - check - and go back - field groups - include all table fields - click on the generate button - go out
    TC SQ03 - Select user group - eg. Payroll
    Infoset - Enter name of newly created Infoset
    Assign users and Infosets - Assign infosets - put tick on payroll - save and go back
    TC PAAH - Expand the nodes and put tick on relevant fields depending upon necessity
    Save the query, giving it the same name as the Infoset for ease of reference.
    Thanks...
    Shambhu
