Camileo X Sports - SD card performance issue - too slow

Hello,
Can anybody help me please? When I turn on the cam and make the FIRST recording, it always tells me the card speed is too slow, even though I have a SanDisk 64 GB Extreme rated at 48 MB/s. The warning only appears on the first recording and not on the following ones.
What can I do? I also have the newest firmware.

Yup! I had the same problem at first; I have the Samsung EVO Class 10 32 GB SDHC.
It's usually brilliant, and it actually works fine at 50 fps, 30 fps and even 720p at 120 fps.
However, I found the cause more easily than most people, simply due to sheer coincidence.
I left the screen on while recording, and right before it stopped recording (by the way, it never froze in my case, it just stopped recording), the screen displayed "low speed card"! What do you know!
I researched the issue and it turns out the Samsung EVO is a bit slower than what the camera requires at 60 fps; at 50 fps it hardly ever stopped recording.
SanDisk Micro SDHC Extreme Class 3 32GB SDSDQXN-032G-G46A
This card is relatively cheap; it is a UHS Speed Class 3 (U3) card, which means it's an Ultra High Speed card, and Class 3 is a great rating. It's about 10 euros more expensive than the Samsung, and from what I've read it works great with the GoPro Hero 4 shooting 4K and 1080p at 120 fps, so it should hold up very well.
It was never about upgrading or downgrading the firmware. Lucky I left the screen on!

Similar Messages

  • Performance is too slow on SQL Azure box

    Hi,
    Performance is too slow on SQL Azure box (Located in Europe)
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS from India):
    SELECT * FROM TABLE_1
    Whereas on a local server it returns 500,000 rows in about 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level: Premium/P1
    DTU: 100
    Max DB Size: 500 GB
    Max Worker Threads: 200
    Max Sessions: 2400
    Benchmark Transaction Rate: 105 transactions per second
    Predictability: Best
    Any suggestion would be highly appreciated.
    Thanks,

    Hello,
    Can you please explain in a little more detail the scenario you are testing? Are you comparing a SQL Database in Europe against a SQL Database in India, or a SQL Database with a local, on-premises SQL Server installation?
    In the first scenario, the round-trip latency of the connection to the datacenter might play a role.
    If you are comparing against a local installation, please note that you might be running on completely different hardware and without network delay, resulting in very different results.
    In both cases you can use the blog post below to assess the resource utilization of the SQL Database during the operation:
    http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
    If the database utilization reaches 100%, you might have to consider upgrading to a higher performance level to achieve the throughput you are looking for.
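    In addition (this is only a sketch, not an official procedure), you can query the sys.dm_db_resource_stats view directly in the Azure SQL Database; it keeps roughly the last hour of utilization samples at about 15-second intervals:
    -- Azure SQL Database: recent resource utilization, one row per ~15 seconds.
    -- If these averages sit near 100% while the query runs, the P1 performance
    -- level is the bottleneck rather than the network round-trip.
    SELECT TOP (20)
           end_time,
           avg_cpu_percent,
           avg_data_io_percent,
           avg_log_write_percent,
           avg_memory_usage_percent
    FROM   sys.dm_db_resource_stats
    ORDER  BY end_time DESC;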
    Thanks,
    Jan 

  • Bumblebee performance is too slow

    Hi everyone,
    This is my second post in this forum, and it has been only 2 days since I met Arch. Before that I was using Ubuntu for 3 years, but due to low performance on my PC I unfortunately decided to say goodbye, which was hard to do.
    Now I am trying to get the same setup as on my previous laptop, and Bumblebee was part of it. I followed the instructions here: https://wiki.archlinux.org/index.php/Bumblebee.
    It seems that Bumblebee is installed and working; however, the frame rate is far too low:
    $ optirun glxspheres64 -info
    Polygons in scene: 62464
    Visual ID of window: 0x20
    Context is Direct
    OpenGL Renderer: GeForce GT 520MX/PCIe/SSE2
    0.023848 frames/sec - 0.021114 Mpixels/sec
    Without optirun I get:
    $ glxspheres64 -info
    Polygons in scene: 62464
    Visual ID of window: 0x20
    Context is Direct
    OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile
    0.033237 frames/sec - 0.029426 Mpixels/sec
    0.029968 frames/sec - 0.026533 Mpixels/sec
    This is impossible as I was getting very good results before.
    I am wondering if I did something wrong, or missed anything.
    Just for information, system specs:
    Intel i7 2670QM 2.2 GHZ
    4 GB RAM
    1 GB GeForce GT 520MX
    512 MB Intel Graphics
    I played 0ad with and without optirun and the performance was good in both cases, but I'm not sure whether it switches the video cards by itself.
    I also have bbswitch installed.
    Any help would be appreciated. Thank you.


  • SSRS performance is too slow

    Hi,
    I am using SQL Server 2008 R2 SP1. This simple query, "SELECT CustomerID FROM Customer", returns 42,900 records. When it is executed within SQL Server Management Studio (SSMS), it takes literally zero (0) seconds. However, when this same query is executed within SQL Server Reporting Services (SSRS) to feed a parameter, it takes about two (2) minutes to execute. The same performance issue occurs when the report calls the stored procedure that serves as the main dataset for the report: in SSMS the stored procedure takes under 3 seconds, in SSRS about 7 minutes.
    Would you please help me identify the source of the problem, and the possible solution?
    Thank you!

    Thank you all for your comments.
    Andrew,
    The "ExecutionLog3" view yields this info:
    <AdditionalInfo>
      <ProcessingEngine>2</ProcessingEngine>
      <ScalabilityTime>
        <Pagination>0</Pagination>
        <Processing>0</Processing>
      </ScalabilityTime>
      <EstimatedMemoryUsageKB>
        <Pagination>18</Pagination>
        <Processing>14807</Processing>
      </EstimatedMemoryUsageKB>
      <DataExtension>
        <SQL>1</SQL>
      </DataExtension>
    </AdditionalInfo>
    and
    TimeProcessing = 1768;  TimeRendering = 677; ByteCount = 88353
    The performance issue happens when the report is reading from the SQL Server database, even for a simple query such as "SELECT DISTINCT CustomerID FROM Customer". This exact same query takes literally zero (0) seconds to fetch results when run directly within SQL Server Management Studio.
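    (For anyone who wants to reproduce the breakdown above: a query along these lines against the ReportServer catalog database returns the same timing columns from the standard ExecutionLog3 view; adjust the ORDER BY or add an ItemPath filter for a specific report.)
    -- ReportServer catalog database: most recent report executions,
    -- with the time split into data retrieval, processing and rendering (ms).
    SELECT TOP (20)
           ItemPath,
           TimeStart,
           TimeDataRetrieval,
           TimeProcessing,
           TimeRendering,
           [RowCount],
           ByteCount,
           Status
    FROM   dbo.ExecutionLog3
    ORDER  BY TimeStart DESC;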
    Any other suggestions, please?
    Thank you!

  • IR report with 1 million records and BLOB files - performance is too slow!

    we are using
    oracle apex 4.2.x
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
    mod_plsql with Apache
    Hardware: HP proliant ML350P
    OS: WINDOWS 2008 R2
    We have a customized content management system developed in APEX. When the IR report is opened it finds about 1 million rows, and each row has a BLOB (< 5 MB, as PDF/TIFF/BMP/JPG); the row count will keep growing in the future. The search performance is very slow!
    How can I improve the performance?
    How can I show a progress indicator to the user while a search is running on the IR report itself?
    Thanx,
    Ram

    It's impossible to make definitive recommendations on performance improvement based on the limited information provided (in particular the absence of APEX debug traces and SQL execution plans), and lacking knowledge of the application  requirements and access to real data.
    As noted above, this is mainly a matter of data model and application design rather than a problem with APEX.
    Based on what has been made available on apex.oracle.com, taking action on the following points may improve performance.
    I have concerns about the data model. The multiple DMS_TOPMGT_MASTER.NWM_DOC_LVL_0x_COD_NUM columns are indications of incomplete normalization, and the use of the DMS_TOPMGT_DETAILS table hints at an EAV model. Look at normalizing the model so that the WM_DOC_LVL_0x_COD_NUM relationship data can be retrieved using a single join rather than multiple scalar subqueries. Store 1:1 document attributes as column values in DMS_TOPMGT_MASTER rather than rows in DMS_TOPMGT_DETAILS.
    There are no statistics on any of the application tables. Make sure statistics are gathered and kept up to date to enable the optimizer to determine correct execution plans.
    There are no indexes on any of the FK columns or search columns. Create indexes on FK columns to improve join performance, and on searched columns to improve search performance.
    More than 50% of the columns in the report query are hidden and not apparently used anywhere in the report. Why is this? A number of these columns are retrieved using scalar subqueries, which will adversely impact performance in a query processing 1 million+ rows. Remove any unnecessary columns from the report query.
    A number of functions are applied to columns in the report query. These will incur processing time for the functions themselves and context switching overhead in the case of the non-kernel dbms_lob.get_length calls. Remove these function calls from the query and replace them with alternative processing that will not impact query performance, particularly the use of APEX column attributes that will only apply transformations to values that are actually displayed, rather than to all rows processed in the query.
    Remove to_char calls from date columns and format them using date format masks in column attributes.
    Remove decode/case switches. Replace this logic using Display as Text (based on LOV, escape special characters) display types based on appropriate LOVs.
    Remove the dbms_lob.get_length calls. Instead add a file length column to the table, compute the file size when files are added/modified using your application or a trigger, and use this as the BLOB column in the query.
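    Purely as an illustration of the last three points (statistics, indexes, and a stored file length), a sketch of the kind of DDL involved follows; the index names, the SECTION_ID foreign key column and the NWM_DOC_FILE_SIZE column are made-up examples, so substitute the real FK and search columns from your model:
    -- Gather optimizer statistics on an application table and its indexes.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => USER,
        tabname => 'DMS_TOPMGT_MASTER',
        cascade => TRUE);
    END;
    /
    -- Hypothetical examples: index a foreign key column and a searched column.
    CREATE INDEX dms_master_section_fk_ix ON dms_topmgt_master (section_id);
    CREATE INDEX dms_master_doc_ref_ix ON dms_topmgt_master (nwm_doc_ref_no);
    -- Store the file size once, instead of calling dbms_lob.getlength per row
    -- in the report query.
    ALTER TABLE dms_topmgt_master ADD (nwm_doc_file_size NUMBER);
    CREATE OR REPLACE TRIGGER dms_master_file_size_trg
      BEFORE INSERT OR UPDATE OF nwm_doc_file_binary ON dms_topmgt_master
      FOR EACH ROW
    BEGIN
      :NEW.nwm_doc_file_size := DBMS_LOB.GETLENGTH(:NEW.nwm_doc_file_binary);
    END;
    /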
    Searching using the Search Field text box in the APEX interactive report Search Bar generates query like:
    select ...
    from
      (select ...
       from
         (select ...
          from
            (...your report query...)
         ) r
       where ((instr(upper("NWM_DOC_REF_NO"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("NWM_DOC_DESC"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("SECTION_NAME"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("CODE_TYPE"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("REF_NUMBER_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("DATE_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("SUBJECT_INDEX"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("NWM_DOC_SERIEL"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("NWM_DOC_DESCRIPTION"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("NWM_DOC_STATUS"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("MIME_TYPE"), upper(:APXWS_SEARCH_STRING_1)) > 0
       or instr(upper("NWM_DOC_FILE_BINARY"), upper(:APXWS_SEARCH_STRING_1)) > 0 ))
      ) r
    where
      rownum <= to_number(:APXWS_MAX_ROW_CNT)
    This will clearly never make use of any available indexes on your table. If you only want users to be able to search using values from 3 columns then remove the Search Field from the Search Bar and only allow users to create explicit filters on those columns. It may then be possible for the optimizer to push the resulting simple predicates down into the inlined report query to make use of indexes on the searched column.
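    As a hypothetical illustration of that last point: if users are only allowed to filter explicitly on, say, NWM_DOC_REF_NO, a function-based index on the uppercased column is the kind of index such a pushed-down predicate could use (table and index names assumed as above):
    -- Hypothetical function-based index supporting case-insensitive filters
    -- on a single explicitly filtered column.
    CREATE INDEX dms_master_doc_ref_up_ix
      ON dms_topmgt_master (UPPER(nwm_doc_ref_no));
    -- A pushed-down equality or starts-with predicate of the form
    --   WHERE UPPER(nwm_doc_ref_no) = UPPER(:filter_value)
    -- can use this index, whereas instr(upper(col), upper(:value)) > 0
    -- applied across twelve columns cannot.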
    I have created a copy of your search page on page 33 of your app and created an After Regions page process that will create Debug entries containing the complete IR query and bind variables used so they can be extracted for easier performance analysis and tuning outside of APEX. You can copy this to your local app and modify the page and region ID parameters as required.

  • Help-SD card & performance issues.

    Hi, I'm new to these forums and to BlackBerry in general. I recently purchased the Curve 8330 through Verizon. I wanted to store some music as well as photos and maybe some video. I did some research on using an 8 GB card and found that my Curve is compatible. I bought the card (SanDisk) through Amazon. I put about 4 GB worth of music on it and put it in the BlackBerry. The music works just fine, but now the device is very slow, with about a 1-2 second delay from the time you push a button, type or scroll until it actually performs that particular function. It also seems to be acting a bit strange. I have taken the card out for now and it is working just fine. What gives? Can someone help?

    Hi Bifocals,
    I have the same problem on my BB 8330 v4.3.0.124 (platform 3.1.0.71). The media card I installed is a SanDisk 4 GB Micro SDHC. When the media card is blank, my BB is fine. After uploading MP3 songs, pictures and video, my BB is very, very slow or locks up.
    I tried pulling the battery out and reinserting it to restart the BB. My BB does a security scan of the media card when restarted. Once restarted, it works fine temporarily, then locks up or operates very, very slowly. In order to get my BB operating properly, I reformatted (FAT32) the media card connected through my PC via Desktop Manager. I tried to upload media to the card again, but I ended up with the same locking problem.
    My BB content security is disabled and the password is also disabled. Why is the BB performing a security scan when media is uploaded to the card, and then locking up or operating like a turtle?
    Thank you,
    Ray 

  • Video Card performance issue

    Hi,
    I've noticed a bug regarding the switch between the dedicated and integrated video card in my Mac. Assume the Intel HD graphics are in use. Press CMD + SHIFT + 4 (the shortcut to take a screenshot): the cursor blinks when I move it. Do the same while the dedicated video card is in use: the cursor does not blink. Do you see the same behavior?
    Thanks.

    If the MacBook's iPhoto Library hasn't been upgraded by v.11, yes. Do you still have the installer for iPhoto 09? If you do, then delete the iPhoto application, its associated preference files and the upgraded Library. Install 09, move the Library from the MacBook and then launch iPhoto 09.

  • Crystal Reports performance is too slow

    Dear SDNers,
    I have designed a Crystal Report that fetches data from a custom (Z) function module. On the Crystal Reports side I used filters, but while executing the report it calls the function module multiple times.
    Because of this the performance is very bad. Why is the report calling the function module multiple times? Do I need to modify something on the Crystal Reports side or on the function module side? Please clarify.
    Regards,
    Venkat

    A similar issue was seen a year ago. It was regarding a function module call being executed multiple times from the CR4Ent tool. It involved the usage of subreports inside that report, and the issue was generic for any function module used for testing.
    At that time, the issue was resolved by upgrading to the latest available patch of CR4Ent and also by applying the latest patch on the SAP R3 side.
    If you are able to post the exact support package and patch level of CR4Ent and of the SAP R3 system, then someone can tell you whether it's the latest or not.
    -Prathamesh

  • Sales & Order report performance is too slow!

    Hi All,
    The sales report is pretty slow; 90% of the runtime is spent in OLAP time. I tried all the possible options in RSRT and also set up an OLAP cache fill at query level, but with no result. Please help me.
    Thanks
    Vasu.

    Hi Vasu,
    Can you please refer to the link below:
    http://wiki.sdn.sap.com/wiki/display/BI/HowtoImproveQueryPerformance-A+Checklist
    It explains how you can improve your query performance.
    Also, as I said, since you are fetching data directly from master data the report will be a bit slow, so try to get the data directly from the cube, and maybe you can use filters as well.
    Hope this helps.
    Regards
    Nilesh

  • Performance issues (excessively slow)

    Acrobat Professional is not running anywhere near as smoothly as one might expect, and I'm looking for a reason why.
    I'm using a Pentium 4 2.4 GHz work computer running Windows XP Home, version 2002 Service Pack 3, with 1 GB of RAM and 55 GB of free space. I've installed Acrobat Pro version 9.1.1.
    When simply editing existing .pdf files as part of a catalogue, all I'm doing is copy & paste, drag & drop, inserting images, and editing text. Nothing more. But even so, I'm seeing wait times of up to 30 seconds between clicks. After simply pasting a template into the file and dragging it into place, Acrobat freezes up for up to 30 seconds before I'm able to do anything more. Zooming in, something that involves only holding down a key and scrolling the mouse wheel, takes a minimum of 20 seconds.
    I've managed to convince my employers to purchase a more powerful tower, but they need some idea of just how powerful it needs to be. This one is already twice as fast as the recommended system requirements, yet I'm having such trouble.
    Or is there a way to solve this problem without upgrading?

    Personally, I think 1 GB for XP is a minimal system; I don't care what the minimum system requirements say. Additionally, you probably have programs in the background that are taking up precious resources (at a minimum you should have a firewall, virus protection, spyware protection, and AcroTray, and who knows what else). This all takes precious resources that Acrobat and XP need. Additionally, there is something very wrong if your workflow requires you to edit a PDF catalogue via copy & paste, drag and drop, and editing text. This is what most of us here consider extensive editing of a PDF file. All editing should, whenever possible, be done in the original program, NOT in Acrobat. Yes, Acrobat provides the tools, but they should be reserved for emergencies. Acrobat is not a DTP, graphics or word-processing program and should not be treated as such. I bet you will find that going back to the original files will be much faster, even though you need to regenerate the PDF files.

  • BI Server performance too slow

    Hi experts,
    I am facing an issue: my OBIEE installation sits on a Unix platform, and much of the time the server performance is too slow. When I check the processes, Java under the Orabi user is using more than 50% of the CPU. I need to find the root cause, as this is the production system and it is affecting many users.
    Thanks in advance.
    MT

    Thanks Prassu,
    Yes, the server is performing a lot of calculations, but they are required, as we also need to cache reports at the daily level.
    I have observed that the MAX_CACHE_ENTRY_SIZE parameter [MAX_CACHE_ENTRY_SIZE = 1 MB;] is set to only 1 MB. I understand that any query fetching more than 1 MB of data will not be cached, but is there any chance that this affects the server as well?
    Thanks
    MT

  • Performance too Slow on SQL Azure box

    Hi,
    Performance is too slow on SQL Azure box:
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS):
    SELECT * FROM TABLE_1
    Whereas on a local server it returns 500,000 rows in about 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level: Premium/P1
    DTU: 100
    Max DB Size: 500 GB
    Max Worker Threads: 200
    Max Sessions: 2400
    Benchmark Transaction Rate: 105 transactions per second
    Predictability: Best
    Thanks,

    Hello,
    Please refer to the following document too:
    http://download.microsoft.com/download/D/2/0/D20E1C5F-72EA-4505-9F26-FEF9550EFD44/Performance%20Guidance%20for%20SQL%20Server%20in%20Windows%20Azure%20Virtual%20Machines.docx
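    It can also help to separate the server-side cost from the network transfer cost before concluding the P1 database itself is slow, since SELECT * streams all 500,000 rows to SSMS. A rough way to do that (just a sketch) from the same SSMS session:
    -- Rough comparison: server-side scan cost vs. network transfer cost.
    SET STATISTICS TIME ON;
    -- Scans the table but returns a single row, so almost no network transfer.
    SELECT COUNT(*) FROM TABLE_1;
    -- Returns every row and column; if most of the 18 minutes is spent here,
    -- the time is going into streaming results from the Azure region to the client.
    SELECT * FROM TABLE_1;
    SET STATISTICS TIME OFF;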
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • Application response too slow- How to resolve (webLogic)

    Hi All, can you please let me know the main areas I need to concentrate on in the WebLogic application server when clients complain that "application performance is too slow"?

    You need to check the following:
    1: Check the server log files for any errors at the time the client is complaining about server performance.
    2: Enable GC logging in the server log and check whether there are any issues with the server's memory usage. From the GC logs you can see how much time is being spent in GC, and based on that analysis you can try allocating more memory or changing the GC algorithm.
    For details refer to the following links:
    http://download.oracle.com/docs/cd/E13222_01/wls/docs100/perform/topten.html
    http://download.oracle.com/docs/cd/E13222_01/wls/docs100/perform/JVMTuning.html
    3: Finally, you can collect thread dump snapshots and check whether there are any stuck threads or deadlock situations in the server at the time of the slow performance.

  • Performance issue sdxc card adapter

    Hi,
    I have a MacBook (2012). I am using VirtualBox to run Windows XP and Windows 8. Originally I started off saving the virtual hard drives on my internal SSD, but after a while they just consumed too much space, so I moved them to an external hard drive connected to my Mac via FireWire. Everything still worked fine and the performance was okay.
    Later I needed a more mobile solution. I thought placing my virtual hard drives on an SD card might be a very mobile idea, so I went for the PhotoFast Memory Expansion Combo Kit (like the Nifty drive, etc.). It is essentially just a tiny adapter for microSD cards that fits perfectly in the SDXC slot of my Mac.
    Anyhow, I moved the virtual disks to a SanDisk 64 GB SDXC Class 10 U1 card, and since then they perform horribly slowly! Why? Do you have any hints regarding that performance issue?
    What I have tried:
    - Reformatting the SD card (exFAT, HFS+)
    - Tick "SSD" in the properties of virtual box
    - using a different SDXC adapter in the USB slot
    Nothing helped ;-( Do you have any idea how to improve the performance of my internal SDXC reader?
    Thanks in advance.
    Ciao tom

    xl wrote:
    I would just get the readers...
    http://www.bestbuy.com/site/Tripp+Lite+-+USB+3.0+Super+Speed+SDXC+Card+Reader/1304482615.p;?id=mp130...
    http://www.bestbuy.com/site/SYBA+Multimedia+-+USB+3.0+Memory+Card+Reader+Reads+2TB+SDXC%2C+Pocket+Si...
    Would I get the SDXC speed? My USB port is only 2.0.

  • Table size is too big Performance issue.

    Hi,
    Let us assume that we have a table with about 160 columns. About 120 of these columns are of VARCHAR data type with sizes of about 100-3000 each.
    The table also has about 2 million rows in it. I am not sure whether that counts as a big table.
    Is a table like this a good representation of the data? I am in doubt, as the table is very big and queries against it might take a long time. We have about 10 indexes on this table.
    What kind of precautions have to be taken when tables like this are involved in the database and are required by the application?
    The database version is Oracle 10.2.0.4.
    I know the question is a bit vague, but I am just wondering what needs to be done, and where I should start digging into the issue in case I get performance problems while trying to select or update the data.
    I also want to know if there is an ideal size for tables, and whether anything bigger than that needs to be treated differently.
    Thanking you
    Rocky

    Any table with more than about 50 columns should be viewed with suspicion. That doesn't mean that there aren't appropriate uses for tables with 120 or 220 columns but it does mean they are reasonably rare.
    What does bother me about your first paragraph is the number of text columns with sizes up to 3K. This is highly indicative of a bad design. One thing is for sure ... no one is writing a report and printing it on anything smaller than a plotter.
    2M rows is small by almost any definition so I wouldn't worry about it. Partitioning is an option, but only if partition pruning can be demonstrated to work with your queries, and we haven't seen any of them, nor would we have any idea what you might use as a partition key or what type of partitioning, so any intelligent discussion of this option would require far more information from you.
    There are no precautions that relate to anything you have written. You've told us nothing about security, usage, transaction volumes, or anything else important to such a consideration.
    What needs to be done, going forward, is for someone that understands normalization to look at this table, examine the business rules, examine the purpose to which it will be put, and most importantly the reports and outputs that will be generated against it, and either justify or change the design. Then with an assessment of the table completed ... you need to run SQL and examine the plans generated using DBMS_XPLAN and timing as compared to your Service Level Agreement (SLA) with the system's customers.
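    For the last step, a minimal sketch of that plan-and-statistics workflow (the table name and predicate below are placeholders, not your real objects):
    -- Keep optimizer statistics current for the wide table.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => USER,
        tabname => 'BIG_TABLE',   -- placeholder for the 160-column table
        cascade => TRUE);
    END;
    /
    -- Capture and display the execution plan for a representative query.
    EXPLAIN PLAN FOR
      SELECT col1, col2
        FROM big_table
       WHERE some_indexed_col = :value;   -- placeholder predicate
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);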
