OwnCloud server: Move data dir to another partition, enable SSL

Hi,
I am using Apache, and Subversion over SSL works just fine; everything is configured according to the wiki.
I have now set up the ownCloud server and it works fine on port 80. I want to run it on port 443 over SSL, but I do not know what needs to be added to /etc/httpd/conf/extra/owncloud.conf. Additionally, I would like the ownCloud server to be accessed via a subpath, like localhost/owncloud, instead of the localhost root dir (so that it does not conflict with svn, which is found at localhost/svn).
The second thing which needs to be fixed is that ownCloud shall not store the user files on the root partition at /usr/share/webapps/owncloud/data, but on another ext4 partition. I tried removing the data folder and creating a symlink to a folder on another partition (I also did chown http:http there), but ownCloud then complained about a missing data directory. Moving the whole owncloud directory to another partition and adjusting the paths in owncloud.conf failed too (403). Any idea how to fix that?
This is what my owncloud.conf file looks like right now:
<IfModule mod_alias.c>
Alias /owncloud /usr/share/webapps/owncloud
</IfModule>
<Directory /usr/share/webapps/owncloud>
Options FollowSymlinks
Order allow,deny
AllowOverride all
allow from all
php_admin_value open_basedir "/srv/http/:/home/:/tmp/:/usr/share/pear/:/usr/share/webapps/"
</Directory>
<VirtualHost *:80>
ServerAdmin [email protected]
DocumentRoot /usr/share/webapps/owncloud
ServerName owncloud.foo.com
ErrorLog logs/owncloud.foo.info-error_log
CustomLog logs/owncloud.foo.info-access_log common
</VirtualHost>
Thanks for reading and maybe helping.

I would suggest you remove the "Include extra/owncloud.conf" line and instead put the contents of owncloud.conf directly in httpd.conf; it makes managing it easier.
In owncloud.conf there should be a section something like
<IfModule mod_alias.c>
Alias /cloud /usr/share/webapps/owncloud/
</IfModule>
If that's there, mydomain.com/cloud will point to OwnCloud, so you can comment out the entire VirtualHost part. Then it won't take over the root directory but it will still be accessible via /cloud.
You can put configuration options in the Directory /usr/share/webapps/owncloud part.
To use it on a different drive, I *think* you could just copy /usr/share/webapps/owncloud to the other drive and change the directory to the new one wherever it appears in owncloud.conf. I don't have a second drive to test it on, but that should work.
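For the SSL half of the original question, a minimal sketch of what could be added might look like the following. This is an assumption-laden example, not a tested config: it assumes mod_ssl is loaded, that httpd is listening on 443, and the certificate paths are placeholders to adjust for your setup.

```apache
# Hypothetical addition to owncloud.conf for SSL on port 443.
# Certificate paths below are placeholders, not real files.
<VirtualHost *:443>
    ServerName owncloud.foo.com
    SSLEngine on
    SSLCertificateFile /etc/httpd/conf/server.crt
    SSLCertificateKeyFile /etc/httpd/conf/server.key
    # Deliberately no DocumentRoot pointing at ownCloud: the existing
    # "Alias /owncloud /usr/share/webapps/owncloud" keeps it under the
    # /owncloud subpath, so it will not conflict with /svn.
</VirtualHost>
```

As for the data directory: the 403 after moving things is consistent with the php_admin_value open_basedir line not covering the new location (it only lists /srv/http/, /home/, /tmp/, /usr/share/pear/ and /usr/share/webapps/). Appending the mount point of the other partition to that open_basedir list, and making sure the new data folder is owned by http:http, is worth trying before anything more drastic.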
Last edited by subraizada3 (2014-02-18 15:11:09)

Similar Messages

  • Best practise to move data to new datawarehouse partitions

    Our DW is about 500GB and expected to double in the next three years.
    Largest table has 290m rows, but some fact tables have as little as 1k rows.
We are also migrating to SQL Server 2012 (from 2008 R2) by building new servers, and will split SSRS and the DBMS.
    I was thinking I would only partition the larger fact tables, but my dilemma is moving the data to the new servers.
I am trying to avoid having to load each table manually, so would moving the tables to be partitioned into a different database on the current server be a viable option? And what about all the current subscriptions and SSRS reports?
Then at some point we will need to start archiving (we keep only 5 years of data), so I wanted this physical design to fit in with the archiving process. It seems crazy to partition every fact table, or is that the better strategy?

    Hi,
    I will move this thread to SQL Server Data Warehousing forum for further discussion.
    Here is a link to article on partitioning on Microsoft SQL Server:
    Strategies for Partitioning Relational Data Warehouses in Microsoft SQL Server
    http://technet.microsoft.com/en-us/library/cc966457.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support
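For readers landing here, the strategy in the linked article boils down to a partition function plus a partition scheme. A minimal, hypothetical T-SQL sketch (the table and boundary dates are illustrative, not from the poster's warehouse):

```sql
-- Illustrative only: yearly range partitioning for a large fact table.
CREATE PARTITION FUNCTION pf_SalesDate (date)
    AS RANGE RIGHT FOR VALUES ('2012-01-01', '2013-01-01', '2014-01-01');

CREATE PARTITION SCHEME ps_SalesDate
    AS PARTITION pf_SalesDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    SaleDate date  NOT NULL,
    Amount   money NOT NULL
) ON ps_SalesDate (SaleDate);
```

Aligning the partition boundaries with the archive window turns later archiving into a metadata-level SWITCH/MERGE rather than a bulk delete, which is one reason partitioning only the large fact tables is usually enough.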

Moving production server from one data center to another

    Hi all,
Actually we have one production server and all the scenarios are working fine. Now we want to create one new server in another data center and replace the old production server (transfer all the scenarios). What are the roles (for a PI consultant) involved in this case?
    Please advise me.
    Thanks in advance.

Build a transport in DEV for all your objects and get Basis to apply them and activate the changes in PROD. You shouldn't need more than monitoring access in any production system except in error situations.

  • TM backup: can I use two drives to back up data on two other drives?

    Hi to all!
    I have with me a 1 TB/ Thunderbolt EHD [external hard drive], a Lacie 2 TB/FW, EHD, a Seagate 2 TB/ 2 USB, EHD, a Seagate 3 TB/ 2 USB, EHD and [to be purchased] a Seagate 2 TB/ 2 USB, EHD.
    I mainly use my Mac for making HD videos using FCPX.
    I am planning to use these drives in the following manner as indicated by the table below:
Drives:
a - 1 TB, Thunderbolt, Buffalo; 1 partition. Usage: editing videos, one active project at a time.
b - 2 TB/FW, LaCie; 1 partition. Usage: mainly for camera archives, music files, FCP events, projects and exported movies.
c - 2 TB/2 USB, Seagate; 1 partition. Usage: same as the LaCie, as a backup of its data.
d - 3 TB/2 USB, Seagate; 2 partitions. Usage: intend to have 2 partitions of 1.5 TB each; one partition for personal data and the other partition to back up data on the Thunderbolt [current project].
e - 2 TB/2 USB, Seagate backup [to be purchased]; 1 partition. Usage: intend to use this for TM backup of my Mac's internal HD.
f - 500 GB, internal, Macintosh HD: applications, documents etc.
Backed up: 'a' by one partition of 'd'; 'f' by the 2 TB Seagate backup 'e'.
    I would like to know whether I could do these things:
    Can I use the TM backup feature in this manner of using 2 separate drive/partition [e/d] to back up data on 2 other drives [f and a]?
    Further, I am manually copying files from 'b' drive to 'c' drive; without incurring further expense, can I ensure that whatever I write onto the LaCie [b] gets copied to the 'c' drive automatically?
    Any thoughts, suggestions and instructions in these matters are welcome.

EXT. DRIVES:
a - 1 TB Thunderbolt. Usage: scratch disk [planning to have only one active project at a time]. My observations: my projects are usually less than 10 minutes, and hence I trust that work would be fast; in case of drive failure, I would be losing only one project.
b - 2 TB/FW [LaCie]. Usage: camera footage, FCPX events, projects and completed movies. My observations: I am importing camera footage primarily into this drive, and being FW, data transfer is fast.
c - 2 TB/USB [Seagate]. Usage: same as the LaCie, as its backup. My observations: am manually copying files from the LaCie onto this.
d - 3 TB/USB [Seagate]. Usage: to be used as TM backup for my Mac's internal HD and also for the scratch disk. My observations: this drive is not a brand new one, but has been given to me by the company's agent as a replacement for my earlier 2 TB drive which had failed. Hope this also will not fail!
e - 2 TB/USB [Seagate]. Usage: planning to have it in 2 partitions: a 'secondary' copy of my Mac's IHD on one partition and personal 'net retrievable' data on the other partition. My observations: because of the 'uncertain' performance of the 3 TB Seagate, I intend to have a copy of my Mac's IHD on one partition, and to place on the other partition only such data that I will not be greatly affected should the drive fail.
    If you can analyse this plan and give your insight upon this matter, I will be immensely happy.
Further, if this proposed use of the 3 TB Seagate is okay with you, I need to know how to remove the Thunderbolt from doing its temporary role of TM backup and how to coronate the 3 TB Seagate as its successor.
    FYI, I would be switching off TM from functioning whenever I am editing video; I would be switching it on only after a full editing session.
    Another question on my mind is this: do I need to partition the drive 'e' at all?
    Message was edited by: somanna

  • Steps to move Data and Log file for clustered SQL Server

    Hi guys 
    we have Active'passive SQL 2008R2 cluster environment.
Looking for steps to move data and log files for user databases and system databases for a SQL Server clustered instance.
Currently data and log files reside on the same drive for user and system databases.
    Thanks
    Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah

    Try the below link
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/468de435-3432-45c2-a50b-23519cd2686e/moving-the-system-databases-in-a-sql-cluster?forum=sqldisasterrecovery
    -Prashanth
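The procedure in the linked thread is essentially the standard file-move without detaching. A hedged sketch of the core T-SQL (database name, logical file names and target paths are placeholders, not from the poster's environment):

```sql
-- Placeholders throughout: substitute your database, logical names, paths.
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILENAME = 'E:\Data\MyDb.mdf');
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log,  FILENAME = 'F:\Log\MyDb.ldf');
ALTER DATABASE MyDb SET OFFLINE;
-- ...physically move the .mdf/.ldf files to the new drives here...
ALTER DATABASE MyDb SET ONLINE;
```

On a clustered instance the new drives must be clustered disks in the same resource group as the SQL Server instance, and the system databases (especially master) need the extra steps covered in the linked thread.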

  • Move data from Citadel to SQL Server 2005

    I wish to move data from a Citadel database. I have tried to use an ODBC connection to a Citadel database with little success. I can connect and write queries using Microsoft's Excel, but what I want to do cannot be done in Excel (due to Excel's row and column limitations and because I need a "heavier" lifter than Excel) . The data-tag names are longer than MS Access will accept and Visual Studio Professional 2005 connects but no queries are successful. MS Access can open the ALIAS table.
What I want:
I want the UTC date and time, data-tag name, and data values between start and end times (logical run) for data contained in the RAWDATA table of the ODBC connection. I need to repeat this for the 400+ data-tags in the Citadel database.
    An ODBC connection is inherently slow and inefficient and thus isn't the connection of choice. However, I don't know of any other means to connect. I have years of data from which to extract runs on multiple DAS Citadel databases.
    Is the Citadel database the best source for this data extraction or is there a "native" LabView database that this extraction process might use for more efficient data extraction?'
    Is there a product marketed by NI that will extract this data to SQL Server or a delimited text file?

    Thanks Amanda:
    Today, I was able to connect from within MS Access via an ADODB connection to the ODBC Citadel database. That is working well so far and seems to be reasonably fast--returned 3 recordsets from the RawData table having about 370; 1,400; and 460,000 records in a little over 16 seconds from a PC with medium power (cpu configuration and RAM availability). The recordsets were ordered but no "WHERE" clauses were involved.
    I'll use these recordsets to vet the algorithm I am developing to identify "runs"--there are 19 complete runs and 1 incomplete run at the end of the Citadel database. The next Citadel database repeats runs 18 and 19 and then has run 20 through whatever. The algorithm currently successfully finds the UTC run start date/time. Tomorrow, I'll work on the run end date/time portion of the algorithm and if all goes well Monday will be the big test of picking the data between the start and end times for each run for each of about 370 data tag names.
    Thanks, again.

Move Documents folder to another partition or drive

    Hello,
I want to move some user folders, like the Documents, Pictures, Music, Videos and Downloads folders, to another partition or drive. I am using Mac OS X 10.5.2. I would like to move only those folders and not the whole Home folder.
    I would appreciate a lot all the help you could give me. Thank you for your time,
    Marcelo.

    If you move all those folders there won't be much left in your home folder so it might be easier to move the entire home directory. But if you insist you can do this by creating symbolic links.
Here are the steps for moving the Documents folder; the rest are handled similarly. (I would advise you to first try them on a test user account.)
1. Run the following command in Terminal:
cp -Rp ~/Documents /Volumes/"diskname"
Put the name of the disk you are moving it to instead of diskname in the above.
2. Make sure that everything copied correctly, then run:
chmod -R -N ~/Documents
3. Next, delete the original Documents folder:
rm -rf ~/Documents
4. Finally, create a symbolic link:
ln -s /Volumes/"diskname"/Documents ~/Documents
Again, put the name of the disk you moved Documents to in the above.
    I believe, that covers it.
    Message was edited by: V.K.
P.S. I believe you'll lose the special system icon on the Documents folder when you do this, so you might want to copy that icon and paste it onto the folder /Volumes/"diskname"/Documents before deleting the original Documents folder.
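The four steps above can be sketched end to end as a small script. This is a generic illustration run against scratch paths (the directories are stand-ins, not real volumes), and the macOS-only `chmod -R -N` ACL-stripping step is shown only as a comment:

```shell
#!/bin/sh
# Demonstrate the move-then-symlink pattern on scratch paths.
SRC="$HOME/docs-demo"      # stands in for ~/Documents (hypothetical)
DEST="/tmp/otherdisk"      # stands in for /Volumes/"diskname" (hypothetical)

mkdir -p "$SRC" "$DEST"
echo "hello" > "$SRC/file.txt"

cp -Rp "$SRC" "$DEST"/         # 1. copy, preserving permissions
# chmod -R -N "$SRC"           # 2. (macOS only) strip ACLs so the delete succeeds
rm -rf "$SRC"                  # 3. delete the original folder
ln -s "$DEST/docs-demo" "$SRC" # 4. symlink back to the old location

cat "$SRC/file.txt"            # reads the file through the symlink
```

After step 4, anything addressing the old path transparently lands on the other volume; the same recipe applies to Pictures, Music, and the rest.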

  • How to move data files out of Users folder that is in a small partition?

    The small partition (50 GB) also has the System, Applications, Developer and Library folders. I would like to move my Documents, Pictures, etc folders from this small partition to the 200 GB partition, preferably without losing too much performance. Does it work just to physically copy them to the other partition and then delete from the Users folder, or does that create problems? I have read somewhere that this may cause a lot of disk searching delays or other more serious problems. Before I reformat the whole HD to a single partition, is there a faster, less drastic solution? Thanks.

    I presume I need to use "file-level" copying to the one-partition drive, but is there anything else to consider?
Yes, if you're talking about CCC, then file-level is all you can do when copying from the drive you're booted from, but that is normally best anyway, as the copied files will be unfragmented.
    When failures occur, I presume they are due mainly to mechanical failures of the drive mechanism, but do "volume failures" occur very frequently?
    Very astute! It's rarely possible to just lose one partition & not others, but if it is HW failure on the HD, then likely all will be lost.
or perhaps to a format that can be read by all platforms (other software needed for that)?
Actually, your Intel Mac can boot from an APM partition; you just can't install to it or do firmware updates from it. But it saved my behind when my only Intel Mac died recently: I had everything 10.5.8 on an APM FireWire drive, so I just plugged it into my eMac and had everything available.
Of course, 99.9% of PPC Macs can't boot OS X from USB, and cannot run/boot 10.6.x.
    A few options...
    You could format that HDD as Fat32/MS-DOS, but you'd be limited to 4 GB Filesizes.
    NTFS-3G Stable Read/Write Driver...
    http://www.ntfs-3g.org/
    MacFUSE: Full Read-Write NTFS for Mac OS X, Among Others...
    http://www.osnews.com/story/16930
    MacDrive for the PCs... allows them to Read/Write HFS+...
    http://www.mediafour.com/products/macdrive/

  • BI UDI data load conflict using MS SQL Server and date fields

    Hi BW Experts!
    We have found some unexpected problems when we are trying to perform a data extraction process from an SQL database (non-SAP application).
    We are using the BI UDI tool (JDBC Driver/Connector).    The basic data extraction works properly, however we have some specific problems to work with SQL Server fields defined as “Date Time”.
The JDBC driver automatically mediates the communication and translates these fields, splitting the original SQL date-time field into two separate fields with the suffixes "_D" (for date) and "_T" (for time).
It works perfectly for extraction, but it doesn't work well when we try to restrict the data selection based on these fields.
When we put a date selection into the infopackage data selection area (we have already tried several date formats), the system simply ignores these selection parameters and sends a SQL statement without a WHERE clause to the DBMS.
Please, has anybody experienced anything like this, or does anyone know something that could help us?
Is this a standard limitation of SAP BI UDI?
    Thanks in advance and best regards,

    Hi Matt and Thomas
    Many thanks for your replies.
    Yes I have to do the join. Even worse I have to write an Aggregate query like this!
    Select o.F1,o.F2, m.F1,m.F2, count(*)
    from
    table@oracleServer o,
    table@microsoftSQLServer m
    where o.F1=m.F1
    group by o.F1,o.F2, m.F1,m.F2;
    These are 2 different systems (hardware & software) which actually do the same job, they produce transactions for a common business entity. Kind of, both sell tickets for the same ABC ferry company.
    I can not put them side by side on the dashboard, as I need to aggregate the data and present it maybe in a Oracle BI Report for Accounting and Financial reconciliation purposes.
The truth is, it is as it is. I can't change it; it's too late.
I have to devise a way to harvest both systems as they are. I don't want to move data around. I thought that with Oracle BI I could write SQL against multiple data sources. I am struggling to find a way, as it seems that it can "SQL query" only one data source at a time.
    I have been hinted on another forum (OracleTURK) to use Oracle Transparent Gateways and Generic Connectivity. http://download.oracle.com/docs/cd/B19306_01/server.102/b14232/toc.htm
Shame, I have to pay licenses for OWB and Oracle Transparent Gateways. I thought DB vendors could do better. Why do I have to pay to access the other 50% of the market?
I can understand the performance implications this might have. Because of it I might even be forced in the end to move data (ETL) into a separate database, slice it into partitions and manage it that way. A pity, because currently I only need one report out of these systems, but it seems it will be dear to get.
    Thank you for all your help.
    Kubilay

  • How to move data from a staging table to three entity tables #2

    Environment: SQL Server 2008 R2
    I have a few questions:
How would I prevent duplicate records if SSIS is executed many times?
How would I verify that the whole, huge volume of data has been loaded into the entity tables?
In reference to "how to move data from a staging table to three entity tables": since I am loading a large volume of data while using a lookup transformation,
which of the merge components is best suited?
How do I configure the merge component correctly? (A screen shot is preferred.)
    Please refer to the following link
    http://social.msdn.microsoft.com/Forums/en-US/5f2128c8-3ddd-4455-9076-05fa1902a62a/how-to-move-data-from-a-staging-table-to-three-entity-tables?forum=sqlintegrationservices

You can use the RowCount transformation in the path where you want to capture record details; inside the RowCount transformation, pass an integer variable to receive the count value.
The event handler can then be configured as follows: inside the Execute SQL task, add an INSERT statement to write the rowcount to your audit table.
Can you also show me how to check against the destination table using key columns inside a lookup task and insert only non-matched records (No Match output)?
    This is explained clearly in below link which Arthur posted
    http://www.sqlis.com/sqlis/post/Get-all-from-Table-A-that-isnt-in-Table-B.aspx
For large data I would prefer doing this in T-SQL. So what you could do is dump the data to a staging table and then apply a
T-SQL MERGE between the tables (or even a combination of INSERT/UPDATE statements).
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
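The T-SQL MERGE approach mentioned above can be sketched as follows. The table and column names here are made up for illustration; the real staging/entity schemas come from the linked thread:

```sql
-- Hypothetical names: a staging table feeding one of the entity tables.
MERGE dbo.EntityCustomer AS tgt
USING dbo.Staging        AS src
   ON tgt.CustomerKey = src.CustomerKey
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Name)
    VALUES (src.CustomerKey, src.Name);
```

Because the match is on the key columns, rerunning the load only updates existing rows instead of inserting duplicates, which also addresses the idempotency question above.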

Recently purchased a LaCie Rugged hard drive, but I'm struggling to move data from my MacBook Pro to the hard drive. Anybody know where I'm going wrong?

After purchasing a LaCie drive, I'm struggling to move data and drop it onto my rugged external hard drive... in fact it won't do it at all. Is there a way of formatting the drive on my Mac so it works, or is the external hard drive not so rugged after all?

Hello there. First of all, plug the disk into a free working port on your Mac (I suppose it's USB 2). Then check if it mounts on the desktop. If not, launch Disk Utility: on the left, you should see the external drive along with the startup disk.
Careful! There is a device tree for each disk! Top: device. Lower: formatted volume.
Select the top disk icon and select "Partition". Reformat the disk with the GUID partition scheme. As the disk format, go for "Mac OS Extended (Journaled)". Let the Mac erase the drive and, at last, it should mount on the desktop.
If the Mac does not let you erase the disk, or should you not be able to even see the device, launch System Information and check that the disk is being seen by Mac OS X (under USB or FireWire, depending on how you plugged in the disk).
    Let us know how it goes.

  • Large data sets and table partitioning

I have an Oracle 9i database and I need to store a huge volume of data in a table, so I use composite partitioning.
Is there a maximum size for my table to keep performance acceptable (if possible, I would like to store several terabytes of data)?
I would like to know if somebody has worked on a similar project, and which tools and environment they used (server, OS, memory size, ...).
Thanks for your help.
    Thanks for your help.

    Yes, users can update data.
    I don't join data.
    This is an OLTP system.
To estimate the size of 5 TB, I just used the number of inserts per day, how many days I keep data in the table, and the average size of one row (taken from an existing first-version table).
I have another question on partitioning: someone tells me it is more efficient, performance-wise, to drop a whole partition (or subpartition, if I use composite partitioning) than to delete rows one at a time.
He says that data access (in my partition) would be very bad if I deleted rows progressively (in this partition), instead of keeping the rows and dropping the whole partition when all its rows are no longer used.
    What do you think about it?
    Thanks
    Sandrine
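For anyone weighing the drop-partition-versus-delete advice, a hedged Oracle sketch (table, column and partition names, and the date ranges, are illustrative only) of the composite layout and the bulk purge:

```sql
-- Illustrative range-hash composite partitioning (valid from Oracle 9i on).
CREATE TABLE events (
    event_date DATE,
    event_id   NUMBER
)
PARTITION BY RANGE (event_date)
SUBPARTITION BY HASH (event_id) SUBPARTITIONS 4
(
    PARTITION p2003 VALUES LESS THAN (TO_DATE('2004-01-01','YYYY-MM-DD')),
    PARTITION p2004 VALUES LESS THAN (TO_DATE('2005-01-01','YYYY-MM-DD'))
);

-- Dropping a whole partition is a fast DDL operation...
ALTER TABLE events DROP PARTITION p2003;
-- ...whereas a row-by-row DELETE generates undo/redo for every row
-- and leaves partly empty blocks behind.
```

This is why dropping an aged-out partition generally beats progressive deletes for time-based retention.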

Best LKM to move data within Oracle from one schema to another schema

    Hi Gurus,
What is the best KM to move data from one schema to another within the same Oracle database?
    Thanks in advance

    Dear,
If your source and target are on the same database server then you don't need an LKM.
You have to: 1. Create one data server for the database server.
2. Create one physical schema for your source and another physical schema for your target under the above-created data server.
3. Then create models for each above-created physical schema.
In this case you just need an IKM knowledge module.
Please refer to http://oditrainings.blogspot.in/2012/08/odi-interface-source-target-on-same.html
If your source and target are on different servers then you must create two different data servers in the topology, and you have to use an LKM.
The best LKM to use is LKM Oracle to Oracle (dblink), but you should have the proper grants to use it.
If your source has very few records you can go with LKM SQL to Oracle; otherwise use LKM Oracle to Oracle (dblink).
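Under the hood, LKM Oracle to Oracle (dblink) pulls rows straight across a database link. A minimal hand-rolled equivalent, with the link name, credentials, TNS alias and table names all as placeholders:

```sql
-- Placeholders only: adapt the link name, credentials and TNS alias.
CREATE DATABASE LINK src_link
    CONNECT TO src_user IDENTIFIED BY src_password
    USING 'SRC_TNS_ALIAS';

INSERT INTO target_schema.customers
SELECT * FROM source_schema.customers@src_link;
```

This is also why the grants matter: the connecting user needs the CREATE DATABASE LINK privilege (or an existing public link) plus SELECT on the remote table.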

  • "Searching for movie data" message locking up keynote

    Using Keynote '09 on a Macbook Pro. Importing keynote files from a shared server. Files contain embedded video.
Never had issues with previous versions of Keynote on the same machine. Now, after opening the files, they lock up and a pop-up message says "Searching for movie data in file 'filename.mov'".
Does anyone know why this happens? Is there a preferences option that can disable it? It happens every time I open the file, even after I have previously let the file run its course, which can take over an hour.
It's killing my productivity. HELP!!

I am also seeing this bug, and I often work on 400 MB+ presentations that use larger screen sizes and have lots of video assets. On large files I find that autosave is not ideal, and I would also like a way to turn it off. The computer that was seeing the issue was running 10.7.5. It was just updated to 10.8.2, so I am keeping my fingers crossed that it works better now. Time will tell.
I also really hope that Apple gives us a real refresh of Keynote on the Mac. As old as it is, it's still ahead of PowerPoint. But it won't stay that way forever standing still, and nobody else has a better alternative that is as fast and flexible to work with. They are squandering something special by letting it gather dust.

Move Data from Converted Database Objects finishes successfully?

I am using SQL Developer 3.2.20.09 to try to migrate the KSMMS database from SQL Server 2008 to Oracle 11g R2. After the migration process is done, the Captured Database Objects model and the Converted Database Objects model have been created in the Migration Projects Navigator panel on the left side, and the corresponding SQL scripts have been generated in the project output directory. I ran the SQL scripts; they created all the tables, views, indexes and stored procedures in the Oracle database, and everything seemed to be working perfectly. However, when I try to Move Data (by right-clicking Converted Database Objects) to move all the data from SQL Server to the Oracle database, the data moving process runs for less than 1 minute and shows me the result as Data Move successful. I have about 1 GB of data in the SQL Server database; it seems nothing has been moved into the Oracle DB. Here are the details of the MS SQL Server database which I am trying to migrate to Oracle:
The SQL Server database name is KSMMS; under that database there are 9 users (azteca, cwweb, dbo, guest, plladmin, pw, pwadmin, tbills, wsdadmin). All my application objects (tables, views, indexes, procedures) are under the azteca user. During the migration process, Converted Database Objects created the users azteca_KSMMS and dob_KSMMS, and all my application objects were created under the azteca_KSMMS user schema. The generated .sql scripts can actually create all the objects under the azteca_KSMMS schema; however, when I try to Move Data, nothing is moved into the Oracle database. I opened SR#3-6708733861 last Friday; it seems Oracle Support can't find what causes the problem during the Data Move process. Any help regarding my questions will be highly appreciated. Thanks.
    Kevin

I changed Data Move Mode to Online and ran the Data Move again. Same results: the migration actions completed successfully, but no records have been moved into the Oracle tables.
I am running SQL Developer under the Windows 8 operating system. There is no Oracle client software available for Windows 8; does that cause any problems?
    Kevin
