Mirror of https://github.com/zulip/zulip.git (synced 2025-10-26 09:34:02 +00:00)

Compare commits

42 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 8a1e20f734 | |
| | 93d4c807a9 | |
| | 4741e683ce | |
| | d5252ff0c9 | |
| | 3d003a8f34 | |
| | 1ec0414786 | |
| | fd06380701 | |
| | d345564ce2 | |
| | dbf19ae3e3 | |
| | 63437e89d7 | |
| | 0a065636c9 | |
| | d61a8c96c5 | |
| | e346044d6a | |
| | b1ff1633b1 | |
| | 1b253cb9e0 | |
| | f612274f91 | |
| | a9e6ad5c6a | |
| | eae16d42d4 | |
| | ca221da997 | |
| | c16b252699 | |
| | e3f8108ca6 | |
| | 5530fe8cb1 | |
| | fca8479065 | |
| | fe34001dd1 | |
| | 9a6b4aeda2 | |
| | 76957a62a5 | |
| | 76f6d9aaa2 | |
| | 5d9eadb734 | |
| | cb8941a081 | |
| | 062df3697a | |
| | ad113134c7 | |
| | c4b2e986c3 | |
| | 1b49c5658c | |
| | cbdb3d6bbf | |
| | 97ccdacb18 | |
| | e96af7906d | |
| | 0d1e401922 | |
| | 8b599c1ed7 | |
| | a852532c95 | |
| | 8e57a3958d | |
| | 86046ae9c3 | |
| | c0096932a6 | |
```diff
@@ -54,7 +54,7 @@ author = 'The Zulip Team'
 # The short X.Y version.
 version = '1.8'
 # The full version, including alpha/beta/rc tags.
-release = '1.8.0'
+release = '1.8.1'
 
 # This allows us to insert a warning that appears only on an unreleased
 # version, e.g. to say that something is likely to have changed.
```
```diff
@@ -7,6 +7,21 @@ All notable changes to the Zulip server are documented in this file.
 This section lists notable unreleased changes; it is generally updated
 in bursts.
 
+### 1.8.1 -- 2018-05-07
+
+- Added an automated tool (`manage.py register_server`) to sign up for
+  the [mobile push notifications service](../production/mobile-push-notifications.html).
+- Improved rendering of block quotes in mobile push notifications.
+- Improved some installer error messages.
+- Fixed several minor bugs with the new Slack import feature.
+- Fixed several visual bugs with the new compose input pills.
+- Fixed several minor visual bugs with night mode.
+- Fixed bug with visual clipping of "g" in the left sidebar.
+- Fixed an issue with the LDAP backend users' Organization Unit (OU)
+  being cached, resulting in trouble logging in after a user was moved
+  between OUs.
+- Fixed a couple subtle bugs with muting.
+
 ### 1.8.0 -- 2018-04-17
 
 **Highlights:**
```
```diff
@@ -18,28 +18,28 @@ support forwarding push notifications to a central push notification
 forwarding service. You can enable this for your Zulip server as
 follows:
 
-1. First, contact support@zulipchat.com with the `zulip_org_id` and
-   `zulip_org_key` values from your `/etc/zulip/zulip-secrets.conf` file, as
-   well as a hostname and contact email address you'd like us to use in case
-   of any issues (we hope to have a nice web flow available for this soon).
-
-2. We'll enable push notifications for your server on our end. Look for a
-   reply from Zulipchat support within 24 hours.
-
-3. Uncomment the `PUSH_NOTIFICATION_BOUNCER_URL = "https://push.zulipchat.com"`
-   line in your `/etc/zulip/settings.py` file, and
+1. Uncomment the `PUSH_NOTIFICATION_BOUNCER_URL =
+   'https://push.zulipchat.com'` line in your `/etc/zulip/settings.py`
+   file (i.e. remove the `#` at the start of the line), and
    [restart your Zulip server](../production/maintain-secure-upgrade.html#updating-settings).
-   Note that if you installed Zulip older than 1.6, you'll need to add
-   the line (it won't be there to uncomment).
+   If you installed your Zulip server with a version older than 1.6,
+   you'll need to add the line (it won't be there to uncomment).
 
-4. If you or your users have already set up the Zulip mobile app,
+1. If you're running Zulip 1.8.1 or newer, you can run `manage.py
+   register_server` from `/home/zulip/deployments/current`.  This
+   command will print the registration data it would send to the
+   mobile push notifications service, ask you to accept the terms of
+   service, and if you accept, register your server.  Otherwise, see
+   the [legacy signup instructions](#legacy-signup).
+
+1. If you or your users have already set up the Zulip mobile app,
    you'll each need to log out and log back in again in order to start
    getting push notifications.
 
-That should be all you need to do!
+Congratulations! You've successfully set up the service.
 
-If you'd like to verify the full pipeline, you can do the following.
-Please follow the instructions carefully:
+If you'd like to verify that everything is working, you can do the
+following. Please follow the instructions carefully:
 
 * [Configure mobile push notifications to always be sent][notification-settings]
   (normally they're only sent if you're idle, which isn't ideal for
```
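The registration step above sends your server's credentials and contact details to the push notifications service. As a rough illustration of the data involved (the field names and structure here are assumptions for the sketch, not Zulip's actual wire format):

```python
# Hypothetical sketch of collecting the values named in the signup
# steps above: the org credentials from zulip-secrets.conf plus a
# hostname and contact email.  Field names are illustrative only.

def build_registration_payload(secrets, hostname, contact_email):
    """Gather the data a server registration would need to submit."""
    return {
        "zulip_org_id": secrets["zulip_org_id"],
        "zulip_org_key": secrets["zulip_org_key"],
        "hostname": hostname,
        "contact_email": contact_email,
    }

payload = build_registration_payload(
    {"zulip_org_id": "abc123", "zulip_org_key": "s3cret"},
    hostname="zulip.example.com",
    contact_email="admin@example.com",
)
print(sorted(payload))
```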
```diff
@@ -57,9 +57,19 @@ in the Android notification area.
 
 [notification-settings]: https://zulipchat.com/help/configure-mobile-notifications
 
-Note that use of the push notification bouncer is subject to the
-[Zulipchat Terms of Service](https://zulipchat.com/terms/). By using push
-notifications, you agree to those terms.
+## Updating your server's registration
+
+Your server's registration includes the server's hostname and contact
+email address (from `EXTERNAL_HOST` and `ZULIP_ADMINISTRATOR` in
+`/etc/zulip/settings.py`, aka the `--hostname` and `--email` options
+in the installer).  You can update your server's registration data by
+running `manage.py register_server` again.
+
+If you'd like to rotate your server's API key for this service
+(`zulip_org_key`), you need to use the `manage.py register_server
+--rotate-key` option; it will automatically generate a new
+`zulip_org_key` and store that new key in
+`/etc/zulip/zulip-secrets.conf`.
 
 ## Why this is necessary
 
```
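Conceptually, key rotation generates a fresh random secret and persists it back to the secrets file. A minimal sketch, assuming a standard `configparser`-style `[secrets]` file (the real `--rotate-key` also re-registers with the bouncer, which is omitted here):

```python
import configparser
import secrets


def rotate_org_key(conf_path):
    """Generate a fresh zulip_org_key and persist it to the secrets file.

    Sketch only: the real command additionally submits the new key to
    the push notifications service.
    """
    config = configparser.ConfigParser()
    config.read(conf_path)
    new_key = secrets.token_hex(32)  # 64 hex characters of randomness
    config.set("secrets", "zulip_org_key", new_key)
    with open(conf_path, "w") as f:
        config.write(f)
    return new_key
```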
```diff
@@ -77,11 +87,22 @@ notification forwarding service, which allows registered Zulip servers
 to send push notifications to the Zulip app indirectly (through the
 forwarding service).
 
-## Security and privacy implications
+## Security and privacy
+
+Use of the push notification bouncer is subject to the
+[Zulipchat Terms of Service](https://zulipchat.com/terms/).  By using
+push notifications, you agree to those terms.
 
 We've designed this push notification bouncer service with security
 and privacy in mind:
 
+* A central design goal of the Push Notification Service is to
+  avoid any message content being stored or logged by the service,
+  even in error cases.  We store only the necessary metadata for
+  delivering the notifications.  This includes the tokens needed to
+  push notifications to the devices, and user ID numbers generated by
+  your Zulip server.  These user ID numbers are opaque to the Push
+  Notification Service, since it has no other data about those users.
 * All of the network requests (both from Zulip servers to the Push
   Notification Service and from the Push Notification Service to the
   relevant Google and Apple services) are encrypted over the wire with
```
```diff
@@ -89,17 +110,69 @@ and privacy in mind:
 * The code for the push notification forwarding service is 100% open
   source and available as part of the
   [Zulip server project on GitHub](https://github.com/zulip/zulip).
-  The Push Notification Service is designed to avoid any message
-  content being stored or logged, even in error cases.
 * The push notification forwarding servers are professionally managed
-  by a small team of security experts.
-* There's a `PUSH_NOTIFICATION_REDACT_CONTENT` setting available to
-  disable any message content being sent via the push notification
-  bouncer (i.e. message content will be replaced with
-  `***REDACTED***`). Note that this setting makes push notifications
-  significantly less usable. We plan to
+  by a small team of security expert engineers.
+* If you'd like an extra layer of protection, there's a
+  `PUSH_NOTIFICATION_REDACT_CONTENT` setting available to disable any
+  message content being sent via the push notification bouncer
+  (i.e. message content will be replaced with `***REDACTED***`).  Note
+  that this setting makes push notifications significantly less
+  usable.  We plan to
   [replace this feature with end-to-end encryption](https://github.com/zulip/zulip/issues/6954)
   which would eliminate that usability tradeoff.
 
 If you have any questions about the security model, contact
 support@zulipchat.com.
+
+## Legacy signup
+
+Here are the legacy instructions for signing a server up for push
+notifications:
+
+1. First, contact support@zulipchat.com with the `zulip_org_id` and
+   `zulip_org_key` values from your `/etc/zulip/zulip-secrets.conf` file, as
+   well as a `hostname` and `contact email` address you'd like us to use in case
+   of any issues (we hope to have a nice web flow available for this soon).
+
+2. We'll enable push notifications for your server on our end. Look for a
+   reply from Zulipchat support within 24 hours.
+
+## Sending push notifications directly from your server
+
+As we discussed above, it is impossible for a single app in the app
+stores to receive push notifications from multiple, mutually
+untrusted, servers.  The Mobile Push Notification Service is one of
+the possible solutions to this problem.  The other possible solution
+is for an individual Zulip server's administrators to build and
+distribute their own copy of the Zulip mobile apps, hardcoding a key
+that they possess.
+
+This solution is possible with Zulip, but it requires the server
+administrators to publish their own copies of
+the Zulip mobile apps (and there's nothing the Zulip team can do to
+eliminate this onerous requirement).
+
+The main work is distributing your own copies of the Zulip mobile apps
+configured to use APNS/GCM keys that you generate.  This is not for
+the faint of heart!  If you haven't done this before, be warned that
+one can easily spend hundreds of dollars (on things like a DUNS number
+registration) and a week struggling through the hoops Apple requires
+to build and distribute an app through the Apple app store, even if
+you're making no code modifications to an app already present in the
+store (as would be the case here).  The Zulip mobile app also gets
+frequent updates that you will have to either forgo or republish to
+the app stores yourself.
+
+If you've done that work, the Zulip server configuration for sending
+push notifications through the new app is quite straightforward:
+
+* Create a
+  [GCM push notifications](https://developers.google.com/cloud-messaging/android/client)
+  key in the Google Developer console and set `android_gcm_api_key` in
+  `/etc/zulip/zulip-secrets.conf` to that key.
+* Register for a
+  [mobile push notification certificate][apple-docs]
+  from Apple's developer console.  Set `APNS_SANDBOX=False` and
+  `APNS_CERT_FILE` to be the path of your APNS certificate file in
+  `/etc/zulip/settings.py`.
+* Restart the Zulip server.
+
+[apple-docs]: https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/APNSOverview.html
```
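The `PUSH_NOTIFICATION_REDACT_CONTENT` behavior described above amounts to a simple transform on the outgoing notification payload. A hedged sketch (the real server code is more involved; the function and payload shape here are illustrative):

```python
REDACTED = "***REDACTED***"


def maybe_redact(payload, redact_content):
    """Return a copy of a notification payload, replacing the message
    content with ***REDACTED*** when the setting is enabled."""
    result = dict(payload)  # never mutate the caller's payload
    if redact_content:
        result["content"] = REDACTED
    return result


notification = {"sender": "iago", "content": "the secret plans"}
print(maybe_redact(notification, True)["content"])  # ***REDACTED***
```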
```diff
@@ -645,6 +645,13 @@ zrequire('marked', 'third/marked/lib/marked');
     assert(!page_params.never_subscribed);
     assert.equal(page_params.notifications_stream, "");
 
+    // Simulate a private stream the user isn't subscribed to
+    initialize();
+    page_params.realm_notifications_stream_id = 89;
+    stream_data.initialize_from_page_params();
+    assert.equal(page_params.notifications_stream, "");
+
+    // Now actually subscribe the user to the stream
     initialize();
     var foo = {
         name: 'foo',
```
```diff
@@ -652,7 +659,6 @@ zrequire('marked', 'third/marked/lib/marked');
     };
 
     stream_data.add_sub('foo', foo);
-    page_params.realm_notifications_stream_id = 89;
     stream_data.initialize_from_page_params();
 
     assert.equal(page_params.notifications_stream, "foo");
```
```diff
@@ -80,6 +80,20 @@ class zulip::base {
     owner  => 'zulip',
     group  => 'zulip',
   }
+  file { ['/etc/zulip/zulip.conf', '/etc/zulip/settings.py']:
+    ensure  => 'file',
+    require => File['/etc/zulip'],
+    mode    => '0644',
+    owner   => 'zulip',
+    group   => 'zulip',
+  }
+  file { '/etc/zulip/zulip-secrets.conf':
+    ensure  => 'file',
+    require => File['/etc/zulip'],
+    mode    => '0640',
+    owner   => 'zulip',
+    group   => 'zulip',
+  }
 
   file { '/etc/security/limits.conf':
     ensure => file,
```
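The Puppet change above pins distinct modes: `0644` for ordinary config files and the stricter `0640` for `zulip-secrets.conf`, so users outside the `zulip` group cannot read the secrets. The distinction can be checked numerically:

```python
import stat

# Modes from the Puppet resources above.
# 0644: owner rw, group r, world r  -- fine for ordinary config.
# 0640: owner rw, group r, no world -- required for the secrets file.
MODES = {
    "/etc/zulip/zulip.conf": 0o644,
    "/etc/zulip/settings.py": 0o644,
    "/etc/zulip/zulip-secrets.conf": 0o640,
}


def world_readable(mode):
    """True if 'other' users can read a file with this mode."""
    return bool(mode & stat.S_IROTH)


for path, mode in MODES.items():
    label = "world-readable" if world_readable(mode) else "group-only"
    print(path, oct(mode), label)
```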
```diff
@@ -4,6 +4,7 @@ class zulip::postfix_localmail {
   if $fqdn == '' {
     fail("Your system does not have a fully-qualified domain name defined. See hostname(1).")
   }
+  $postfix_mailname = zulipconf("postfix", "mailname", $fqdn)
   package { $postfix_packages:
     ensure  => "installed",
     require => File['/etc/mailname'],
```
```diff
@@ -20,7 +20,7 @@ alias_maps = hash:/etc/aliases
 alias_database = hash:/etc/aliases
 transport_maps = hash:/etc/postfix/transport
 myorigin = /etc/mailname
-mydestination = localhost, <%= @fqdn %>
+mydestination = localhost, <%= @postfix_mailname %>
 relayhost =
 mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
 mailbox_size_limit = 0
```
```diff
@@ -112,15 +112,20 @@ fi
 # Check early for missing SSL certificates
 if [ "$PUPPET_CLASSES" = "zulip::voyager" ] && [ -z "$USE_CERTBOT""$SELF_SIGNED_CERT" ] && { ! [ -e "/etc/ssl/private/zulip.key" ] || ! [ -e "/etc/ssl/certs/zulip.combined-chain.crt" ]; }; then
     set +x
-    echo
-    echo "Could not find SSL certificates!"
-    for f in "/etc/ssl/private/zulip.key" "/etc/ssl/certs/zulip.combined-chain.crt"; do
-        [ -e "$f" ] || echo " - $f is missing!"
-    done
     cat <<EOF
 
-See https://zulip.readthedocs.io/en/latest/production/ssl-certificates.html for help.
-For non-production testing, try the --self-signed-cert option.
+No SSL certificate found.  One or both required files are missing:
+    /etc/ssl/private/zulip.key
+    /etc/ssl/certs/zulip.combined-chain.crt
+
+Suggested solutions:
+  * For most sites, the --certbot option is recommended.
+  * If you have your own key and cert, see docs linked below
+    for how to install them.
+  * For non-production testing, try the --self-signed-cert option.
+
+For help and more details, see our SSL documentation:
+    https://zulip.readthedocs.io/en/latest/production/ssl-certificates.html
 
 Once fixed, just rerun scripts/setup/install; it'll pick up from here!
 
```
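The installer check above boils down to "which of the two required cert files don't exist?". The same test, sketched in Python for illustration (the installer itself does this in shell):

```python
import os

# The two files the installer's SSL check above looks for.
CERT_FILES = [
    "/etc/ssl/private/zulip.key",
    "/etc/ssl/certs/zulip.combined-chain.crt",
]


def missing_cert_files(paths=CERT_FILES):
    """Return the subset of required SSL files that don't exist."""
    return [p for p in paths if not os.path.exists(p)]
```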
```diff
@@ -53,7 +53,7 @@ def generate_django_secretkey():
 
 def get_old_conf(output_filename):
     # type: (str) -> Dict[str, Text]
-    if not os.path.exists(output_filename):
+    if not os.path.exists(output_filename) or os.path.getsize(output_filename) == 0:
         return {}
 
     secrets_file = configparser.RawConfigParser()
```
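The fix above treats a zero-byte secrets file the same as a missing one, so `configparser` is never handed an empty file. A self-contained version of the guarded reader (simplified from the diff; the section-reading line is an assumption for the sketch):

```python
import configparser
import os


def get_old_conf(output_filename):
    """Read an existing secrets file, treating a missing *or empty*
    file as "no existing secrets" -- the case the fix above adds."""
    if not os.path.exists(output_filename) or os.path.getsize(output_filename) == 0:
        return {}
    secrets_file = configparser.RawConfigParser()
    secrets_file.read(output_filename)
    # Assumed shape for this sketch: a single [secrets] section.
    return dict(secrets_file.items("secrets"))
```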
```diff
@@ -148,12 +148,20 @@ exports.maybe_scroll_up_selected_message = function () {
         return;
     }
     var selected_row = current_msg_list.selected_row();
 
+    if (selected_row.height() > message_viewport.height() - 100) {
+        // For very tall messages whose height is close to the entire
+        // height of the viewport, don't auto-scroll the viewport to
+        // the end of the message (since that makes it feel annoying
+        // to work with very tall messages).  See #8941 for details.
+        return;
+    }
+
     var cover = selected_row.offset().top + selected_row.height()
         - $("#compose").offset().top;
     if (cover > 0) {
         message_viewport.user_initiated_animate_scroll(cover+5);
     }
 
 };
 
 function fill_in_opts_from_current_narrowed_view(msg_type, opts) {
```
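The new early return reduces to a single threshold predicate: skip auto-scrolling when the message is within 100px of the viewport height. Sketched in Python for illustration (the real code is the JavaScript above; the 100px margin is taken from the diff):

```python
def should_skip_autoscroll(row_height, viewport_height, margin=100):
    """Skip auto-scrolling for messages nearly as tall as the viewport
    (the condition behind the early return above, see #8941)."""
    return row_height > viewport_height - margin


print(should_skip_autoscroll(950, 1000))  # True: message fills the viewport
print(should_skip_autoscroll(300, 1000))  # False: normal-height message
```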
```diff
@@ -253,11 +253,11 @@ exports.update_messages = function update_messages(events) {
     // propagated edits to be updated (since the topic edits can have
     // changed the correct grouping of messages).
     if (topic_edited) {
-        home_msg_list.rerender();
+        home_msg_list.update_muting_and_rerender();
         // However, we don't need to rerender message_list.narrowed if
         // we just changed the narrow earlier in this function.
         if (!changed_narrow && current_msg_list === message_list.narrowed) {
-            message_list.narrowed.rerender();
+            message_list.narrowed.update_muting_and_rerender();
         }
     } else {
         // If the content of the message was edited, we do a special animation.
```
```diff
@@ -30,11 +30,17 @@ function process_result(data, opts) {
     _.each(messages, message_store.set_message_booleans);
     messages = _.map(messages, message_store.add_message_metadata);
 
+    // In case any of the newly fetched messages are new, add them to
+    // our unread data structures.  It's important that this run even
+    // when fetching in a narrow, since we might return unread
+    // messages that aren't in the home view data set (e.g. on a muted
+    // stream).
+    message_util.do_unread_count_updates(messages);
+
     // If we're loading more messages into the home view, save them to
     // the message_list.all as well, as the home_msg_list is reconstructed
     // from message_list.all.
     if (opts.msg_list === home_msg_list) {
-        message_util.do_unread_count_updates(messages);
         message_util.add_messages(messages, message_list.all, {messages_are_new: false});
     }
 
```
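The change above moves the unread bookkeeping out of the home-view-only branch, so fetches made while narrowed (e.g. to a muted stream) still update unread counts. The reordering, modeled in Python with illustrative names (not Zulip's actual data structures):

```python
def process_fetched_messages(messages, unread_ids, home_view_ids, is_home_view):
    """Sketch of the reordered logic: unread bookkeeping runs for every
    fetch, while the home-view list is only extended when the fetch
    targeted the home view."""
    for m in messages:
        if m["unread"]:
            unread_ids.add(m["id"])  # runs even when fetching in a narrow
    if is_home_view:
        home_view_ids.extend(m["id"] for m in messages)


unread, home = set(), []
process_fetched_messages(
    [{"id": 1, "unread": True}], unread, home, is_home_view=False)
print(unread, home)  # unread is tracked even though home is untouched
```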
```diff
@@ -564,11 +564,10 @@ exports.MessageList.prototype = {
         }
     },
 
-    rerender_after_muting_changes: function MessageList_rerender_after_muting_changes() {
+    update_muting_and_rerender: function MessageList_update_muting_and_rerender() {
         if (!this.muting_enabled) {
             return;
         }
-
         this._items = this.unmuted_messages(this._all_items);
         this.rerender();
     },
```
```diff
@@ -15,9 +15,11 @@ exports.rerender = function () {
     // re-doing a mute or unmute is a pretty recoverable thing.
 
     stream_list.update_streams_sidebar();
-    current_msg_list.rerender_after_muting_changes();
+    if (current_msg_list.muting_enabled) {
+        current_msg_list.update_muting_and_rerender();
+    }
     if (current_msg_list !== home_msg_list) {
-        home_msg_list.rerender_after_muting_changes();
+        home_msg_list.update_muting_and_rerender();
     }
 };
 
```
```diff
@@ -518,8 +518,15 @@ exports.initialize_from_page_params = function () {
     // Migrate the notifications stream from the new API structure to
     // what the frontend expects.
     if (page_params.realm_notifications_stream_id !== -1) {
-        page_params.notifications_stream =
-            exports.get_sub_by_id(page_params.realm_notifications_stream_id).name;
+        var notifications_stream_obj =
+            exports.get_sub_by_id(page_params.realm_notifications_stream_id);
+        if (notifications_stream_obj) {
+            page_params.notifications_stream = notifications_stream_obj.name;
+        } else {
+            // This happens when the notifications stream is a private
+            // stream the current user is not subscribed to.
+            page_params.notifications_stream = "";
+        }
     } else {
         page_params.notifications_stream = "";
     }
```
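The JavaScript fix above guards against `get_sub_by_id` returning nothing (a private stream the user isn't subscribed to), which previously crashed on `.name`. The same fallback pattern in Python:

```python
def notifications_stream_name(subs_by_id, stream_id):
    """Resolve the realm notifications stream to a name, falling back
    to "" when the stream is unset (-1) or unknown (e.g. a private
    stream the current user is not subscribed to)."""
    if stream_id == -1:
        return ""
    sub = subs_by_id.get(stream_id)
    return sub["name"] if sub else ""


subs = {42: {"name": "announce"}}
print(notifications_stream_name(subs, 42))  # announce
print(notifications_stream_name(subs, 89))  # "" -- previously a crash
```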
```diff
@@ -69,6 +69,7 @@
     padding: 0px;
     display: flex;
     align-items: center;
+    width: 100%;
 }
 
 .compose_table .right_part .icon-vector-narrow {
```
```diff
@@ -82,25 +83,18 @@
 }
 
 .compose_table .pm_recipient {
-    margin: 0px 20px 0px 10px;
+    margin-left: 5px;
+    margin-right: 20px;
     display: flex;
+    width: 100%;
 }
 
 .compose_table #private-message .to_text {
-    width: 65px;
-    vertical-align: top;
+    vertical-align: middle;
     font-weight: 600;
 }
 
-.compose_table #private-message .to_text span {
-    display: flex;
-    align-items: center;
-
-    position: relative;
-    top: -1px;
-}
-
 .compose_table #compose-lock-icon {
     position: relative;
     left: 5px;
```
```diff
@@ -179,7 +173,6 @@ table.compose_table {
     display: none;
     position: absolute;
     right: 0px;
-    top: 5px;
 }
 
 #compose_invite_users,
```
```diff
@@ -393,7 +386,7 @@ input.recipient_box {
 
 #stream-message,
 #private-message {
-    display: none;
+    display: flex;
 }
 
 .compose_table .drafts-link {
```
```diff
@@ -109,7 +109,7 @@
     opacity: 0.5;
 }
 
-.pm_recipient .pill-container .pill + .input:focus:empty::before {
+.pm_recipient .pill-container .pill + .input:empty::before {
     content: attr(data-some-recipients-text);
     opacity: 0.5;
 }
```
```diff
@@ -204,7 +204,6 @@ li.active-sub-filter {
 .conversation-partners,
 .topic-name {
     display: block;
-    line-height: 1.1;
     width: calc(100% - 5px);
     white-space: nowrap;
     overflow: hidden;
```
```diff
@@ -212,6 +211,11 @@ li.active-sub-filter {
     padding-right: 2px;
 }
 
+.topic-name {
+    /* TODO: We should figure out how to remove this without changing the spacing */
+    line-height: 1.1;
+}
+
 .left-sidebar li a.topic-name:hover {
     text-decoration: underline;
 }
```
```diff
@@ -312,7 +316,6 @@ ul.filters li.out_of_home_view li.muted_topic {
     display: inline-block;
     overflow: hidden;
     text-overflow: ellipsis;
-    line-height: 1.1;
     position: relative;
     width: 100%;
     padding-right: 2px;
```
```diff
@@ -369,7 +372,7 @@ li.expanded_private_message {
 }
 
 li.expanded_private_message a {
-    margin: 2px 0px;
+    margin: 1px 0px;
 }
 
 .show-all-streams a {
```
```diff
@@ -99,6 +99,11 @@ body.night-mode .private_message_count {
     background-color: hsla(105, 2%, 50%, 0.5);
 }
 
+body.night-mode .pill-container {
+    border-style: solid;
+    border-width: 1px;
+}
+
 body.night-mode .pm_recipient .pill-container .pill {
     color: inherit;
     border: 1px solid hsla(0, 0%, 0%, 0.50);
```
```diff
@@ -210,10 +215,12 @@ body.night-mode .popover.right .arrow {
     border-right-color: hsl(235, 18%, 7%);
 }
 
+body.night-mode .close,
 body.night-mode ul.filters li:hover .arrow {
     color: hsl(236, 33%, 80%);
 }
 
+body.night-mode .close:hover,
 body.night-mode ul.filters li .arrow:hover {
     color: hsl(0, 0%, 100%);
 }
```
```diff
@@ -478,3 +485,10 @@ body.night-mode .ps__rail-y {
         background-color: hsla(0, 0%, 0%, 0.2);
     }
 }
+
+body.night-mode #bots_lists_navbar .active a {
+    color: #ddd;
+    background-color: hsl(212, 28%, 18%);
+    border-color: #ddd;
+    border-bottom-color: transparent;
+}
```
```diff
@@ -135,7 +135,7 @@ label {
 
 .wrapped-table {
     table-layout: fixed;
-    word-break: break-all;
+    word-break: break-word;
     word-wrap: break-word;
     white-space: -moz-pre-wrap !important;
    white-space: -webkit-pre-wrap;
```
```diff
@@ -583,7 +583,11 @@ input[type=checkbox].inline-block {
     width: 13px;
     height: 20px;
     margin: 0;
-    margin-right: 3px;
+}
+
+/* make the spinner green like the text and box. */
+#settings_page .alert-notification .loading_indicator_spinner svg path {
+    fill: hsl(178, 100%, 40%);
 }
 
 #settings_page .alert-notification .loading_indicator_text {
```
@@ -43,6 +43,13 @@ hr {
     border-width: 2px;
 }
 
+.rangeslider-container {
+    -webkit-user-select: none;
+    -moz-user-select: none;
+    -ms-user-select: none;
+    user-select: none;
+}
+
 .rangeselector text {
     font-weight: 400;
 }
@@ -893,6 +893,7 @@ td.pointer {
     border-radius: 3px 0px 0px 0px;
     /* box-shadow: 0px 2px 3px hsl(0, 0%, 80%); */
     box-shadow: inset 0px 2px 1px -2px hsl(0, 0%, 20%), inset 2px 0px 1px -2px hsl(0, 0%, 20%) !important;
+    width: 10px;
 }
 
 .summary_row_private_message .summary_colorblock {
@@ -8,7 +8,7 @@
     <span></span>
 </label>
 <label for="{{prefix}}{{setting_name}}" class="inline-block" id="{{prefix}}{{setting_name}}_label">
-    {{label}}
+    {{{label}}}
 </label>
 {{{end_content}}}
 </div>
@@ -56,80 +56,58 @@
 <button type="button" class="close" id='compose_close' title="{{ _('Cancel compose') }} (Esc)">×</button>
 <form id="send_message_form" action="/json/messages" method="post">
     {{ csrf_input }}
-    <table class="compose_table">
-        <tbody>
-            <tr class="ztable_layout_row">
-                <td class="ztable_comp_col1" />
-                <td class="ztable_comp_col2" />
-            </tr>
-            <tr id="stream-message">
-                <td class="message_header_colorblock message_header_stream left_part">
-                </td>
-                <td class="right_part">
-                    <span id="compose-lock-icon">
-                        <i class="icon-vector-lock" title="{{ _('This is an invite-only stream') }}"></i>
-                    </span>
-                    <input type="text" class="recipient_box" name="stream" id="stream"
-                           maxlength="30"
-                           value="" placeholder="{{ _('Stream') }}" autocomplete="off" tabindex="0" aria-label="{{ _('Stream') }}"/>
-                    <i class="icon-vector-narrow icon-vector-small"></i>
-                    <input type="text" class="recipient_box" name="subject" id="subject"
-                           maxlength="60"
-                           value="" placeholder="{{ _('Topic') }}" autocomplete="off" tabindex="0" aria-label="{{ _('Topic') }}"/>
-                </td>
-            </tr>
-            <tr id="private-message">
-                <td class="to_text">
-                    <span>{{ _('To') }}:</span>
-                </td>
-                <td class="right_part">
-                    <div class="pm_recipient">
-                        <div class="pill-container" data-before="{{ _('You and') }}">
-                            <div class="input" contenteditable="true" id="private_message_recipient" name="recipient"
-                                 data-no-recipients-text="{{ _('Add one or more users') }}" data-some-recipients-text="{{ _('Add another user...') }}"></div>
-                        </div>
-                    </div>
-                </td>
-            </tr>
-            <tr>
-                <td class="messagebox" colspan="2">
-                    <textarea class="new_message_textarea" name="content" id='compose-textarea'
-                              value="" placeholder="{{ _('Compose your message here') }}" tabindex="0" maxlength="10000" aria-label="{{ _('Compose your message here...') }}"></textarea>
+    <div class="compose_table">
+        <div id="stream-message">
+            <div class="message_header_colorblock message_header_stream left_part"></div>
+            <div class="right_part">
+                <span id="compose-lock-icon">
+                    <i class="icon-vector-lock" title="{{ _('This is an invite-only stream') }}"></i>
+                </span>
+                <input type="text" class="recipient_box" name="stream" id="stream" maxlength="30" value="" placeholder="{{ _('Stream') }}" autocomplete="off" tabindex="0" aria-label="{{ _('Stream') }}" />
+                <i class="icon-vector-narrow icon-vector-small"></i>
+                <input type="text" class="recipient_box" name="subject" id="subject" maxlength="60" value="" placeholder="{{ _('Topic') }}" autocomplete="off" tabindex="0" aria-label="{{ _('Topic') }}" />
+            </div>
+        </div>
+        <div id="private-message">
+            <div class="to_text">
+                <span>{{ _('To') }}:</span>
+            </div>
+            <div class="right_part">
+                <div class="pm_recipient">
+                    <div class="pill-container" data-before="{{ _('You and') }}">
+                        <div class="input" contenteditable="true" id="private_message_recipient" name="recipient" data-no-recipients-text="{{ _('Add one or more users') }}" data-some-recipients-text="{{ _('Add another user...') }}"></div>
+                    </div>
+                </div>
+            </div>
+        </div>
+        <div>
+            <div class="messagebox" colspan="2">
+                <textarea class="new_message_textarea" name="content" id='compose-textarea' value="" placeholder="{{ _('Compose your message here') }}" tabindex="0" maxlength="10000" aria-label="{{ _('Compose your message here...') }}"></textarea>
                 <div class="scrolling_list" id="preview_message_area" style="display:none;">
                     <div id="markdown_preview_spinner"></div>
                     <div id="preview_content"></div>
+                </div>
+                <div class="drag"></div>
+                <div id="below-compose-content">
+                    <input type="file" id="file_input" class="notvisible pull-left" multiple />
+                    <a class="message-control-button icon-vector-smile" id="emoji_map" href="#" title="{{ _('Add emoji') }}"></a>
+                    <a class="message-control-button icon-vector-font" title="{{ _('Formatting') }}" data-overlay-trigger="markdown-help"></a>
+                    <a class="message-control-button icon-vector-paper-clip notdisplayed" id="attach_files" href="#" title="{{ _('Attach files') }}"></a> {% if jitsi_server_url %}
+                    <a class="message-control-button fa fa-video-camera" id="video_link" href="#" title="{{ _('Add video call') }}"></a> {% endif %}
+                    <a id="undo_markdown_preview" class="message-control-button icon-vector-edit" style="display:none;" title="{{ _('Write') }}"></a>
+                    <a id="markdown_preview" class="message-control-button icon-vector-eye-open" title="{{ _('Preview') }}"></a>
+                    <a class="drafts-link" href="#drafts" title="{{ _('Drafts') }} (d)">{{ _('Drafts') }}</a>
+                    <span id="sending-indicator"></span>
+                    <div id="send_controls" class="new-style">
+                        <label id="enter-sends-label" class="compose_checkbox_label">
+                            <input type="checkbox" id="enter_sends" />{{ _('Press Enter to send') }}
+                        </label>
+                        <button type="submit" id="compose-send-button" class="button small send_message" tabindex="150" title="{{ _('Send') }} (Ctrl + Enter)">{{ _('Send') }}</button>
                     </div>
-                    <div class="drag"></div>
-                    <div id="below-compose-content">
-                        <input type="file" id="file_input" class="notvisible pull-left" multiple />
-                        <a class="message-control-button icon-vector-smile"
-                           id="emoji_map" href="#" title="{{ _('Add emoji') }}"></a>
-                        <a class="message-control-button icon-vector-font"
-                           title="{{ _('Formatting') }}" data-overlay-trigger="markdown-help"></a>
-                        <a class="message-control-button icon-vector-paper-clip notdisplayed"
-                           id="attach_files" href="#" title="{{ _('Attach files') }}"></a>
-                        {% if jitsi_server_url %}
-                        <a class="message-control-button fa fa-video-camera"
-                           id="video_link" href="#" title="{{ _('Add video call') }}"></a>
-                        {% endif %}
-                        <a id="undo_markdown_preview"
-                           class="message-control-button icon-vector-edit"
-                           style="display:none;" title="{{ _('Write') }}"></a>
-                        <a id="markdown_preview" class="message-control-button icon-vector-eye-open"
-                           title="{{ _('Preview') }}"></a>
-                        <a class="drafts-link" href="#drafts" title="{{ _('Drafts') }} (d)">{{ _('Drafts') }}</a>
-                        <span id="sending-indicator"></span>
-                        <div id="send_controls" class="new-style">
-                            <label id="enter-sends-label" class="compose_checkbox_label">
-                                <input type="checkbox" id="enter_sends" />{{ _('Press Enter to send') }}
-                            </label>
-                            <button type="submit" id="compose-send-button" class="button small send_message" tabindex="150" title="{{ _('Send') }} (Ctrl + Enter)">{{ _('Send') }}</button>
-                        </div>
-                    </div>
-                </td>
-            </tr>
-        </tbody>
-    </table>
+                </div>
+            </div>
+        </div>
+    </div>
 </form>
 </div>
 </div>
@@ -26,7 +26,7 @@ Log in to your Zulip server as the `zulip` user. Run the following
 commands, replacing `<token>` with the value generated above:
 
 ```
-cd ~/zulip
+cd /home/zulip/deployments/current
 ./manage.py convert_slack_data slack_data.zip --token <token> --output converted_slack_data
 ./manage.py import --destroy-rebuild-database '' converted_slack_data
 ```
@@ -42,11 +42,34 @@ commands, replacing `<token>` with the value generated above, and
 Zulip organization.
 
 ```
-cd ~/zulip
+cd /home/zulip/deployments/current
 ./manage.py convert_slack_data slack_data.zip --token <token> --output converted_slack_data
 ./manage.py import --import-into-nonempty <subdomain> converted_slack_data
 ```
 
+## Logging in
+
+Once the import completes, all your users will have accounts in your
+new Zulip organization, but those accounts won't have passwords yet
+(since for very good security reasons, passwords are not exported).
+Your users will need to either authenticate using something like
+Google auth, or start by resetting their passwords.
+
+You can use the `./manage.py send_password_reset_email` command to
+send password reset emails to your users.  We
+recommend starting with sending one to yourself for testing:
+
+```
+./manage.py send_password_reset_email -u username@example.com
+```
+
+and then once you're ready, you can email them to everyone using e.g.
+
+```
+./manage.py send_password_reset_email -r '' --all-users
+```
+
+(replace `''` with your subdomain if you're using one).
+
 ## Caveats
 
 - Slack doesn't export private channels or direct messages unless you pay
@@ -55,7 +55,7 @@
 </div>
 <div id="stream-filters-container" class="scrolling_list">
     <div class="input-append notdisplayed">
-        <input class="stream-list-filter" type="text" placeholder="{{ _('Search streams') }}" />
+        <input class="stream-list-filter" type="text" autocomplete="off" placeholder="{{ _('Search streams') }}" />
         <button type="button" class="btn clear_search_button" id="clear_search_stream_button">
             <i class="icon-vector-remove"></i>
         </button>
@@ -13,7 +13,7 @@
     <i id="user_filter_icon" class='fa fa-search' aria-label="{{ _('Filter users') }}" data-toggle="tooltip" title="{{ _('Filter users') }} (w)"></i>
 </div>
 <div class="input-append notdisplayed">
-    <input class="user-list-filter" type="text" placeholder="{{ _('Search people') }}" />
+    <input class="user-list-filter" type="text" autocomplete="off" placeholder="{{ _('Search people') }}" />
     <button type="button" class="btn clear_search_button" id="clear_search_people_button">
         <i class="icon-vector-remove"></i>
     </button>
@@ -78,6 +78,7 @@ not_yet_fully_covered = {
     'zerver/lib/feedback.py',
     'zerver/lib/fix_unreads.py',
     'zerver/lib/html_diff.py',
+    'zerver/lib/import_realm.py',
     'zerver/lib/logging_util.py',
     'zerver/lib/migrate.py',
     'zerver/lib/outgoing_webhook.py',
@@ -1,4 +1,4 @@
-ZULIP_VERSION = "1.8.0"
+ZULIP_VERSION = "1.8.1"
 
 # Bump the minor PROVISION_VERSION to indicate that folks should provision
 # only when going from an old version of the code to a newer version. Bump
@@ -52,20 +52,20 @@
         "name": "fenced_quote",
         "input": "Hamlet said:\n~~~ quote\nTo be or **not** to be.\n\nThat is the question\n~~~",
         "expected_output": "<p>Hamlet said:</p>\n<blockquote>\n<p>To be or <strong>not</strong> to be.</p>\n<p>That is the question</p>\n</blockquote>",
-        "text_content": "Hamlet said:\n\nTo be or not to be.\nThat is the question\n"
+        "text_content": "Hamlet said:\n> To be or not to be.\n> That is the question\n"
     },
     {
         "name": "fenced_nested_quote",
         "input": "Hamlet said:\n~~~ quote\nPolonius said:\n> This above all: to thine ownself be true,\nAnd it must follow, as the night the day,\nThou canst not then be false to any man.\n\nWhat good advice!\n~~~",
         "expected_output": "<p>Hamlet said:</p>\n<blockquote>\n<p>Polonius said:</p>\n<blockquote>\n<p>This above all: to thine ownself be true,<br>\nAnd it must follow, as the night the day,<br>\nThou canst not then be false to any man.</p>\n</blockquote>\n<p>What good advice!</p>\n</blockquote>",
-        "text_content": "Hamlet said:\n\nPolonius said:\n\nThis above all: to thine ownself be true,\nAnd it must follow, as the night the day,\nThou canst not then be false to any man.\n\nWhat good advice!\n"
+        "text_content": "Hamlet said:\n> Polonius said:\n> > This above all: to thine ownself be true,\n> > And it must follow, as the night the day,\n> > Thou canst not then be false to any man.\n> What good advice!\n"
     },
     {
         "name": "complexly_nested_quote",
         "input": "I heard about this second hand...\n~~~ quote\n\nHe said:\n~~~ quote\nThe customer is complaining.\n\nThey looked at this code:\n``` \ndef hello(): print 'hello\n```\nThey would prefer:\n~~~\ndef hello()\n  puts 'hello'\nend\n~~~\n\nPlease advise.\n~~~\n\nShe said:\n~~~ quote\nJust send them this:\n```\necho \"hello\n\"\n```\n~~~",
         "expected_output": "<p>I heard about this second hand...</p>\n<blockquote>\n<p>He said:</p>\n<blockquote>\n<p>The customer is complaining.</p>\n<p>They looked at this code:</p>\n<div class=\"codehilite\"><pre><span></span>def hello(): print 'hello\n</pre></div>\n\n\n<p>They would prefer:</p>\n</blockquote>\n<p>def hello()<br>\n  puts 'hello'<br>\nend</p>\n</blockquote>\n<p>Please advise.</p>\n<div class=\"codehilite\"><pre><span></span>She said:\n~~~ quote\nJust send them this:\n```\necho &quot;hello\n&quot;\n```\n</pre></div>",
         "marked_expected_output": "<p>I heard about this second hand...</p>\n<blockquote>\n<p>He said:</p>\n<blockquote>\n<p>The customer is complaining.</p>\n<p>They looked at this code:</p>\n<div class=\"codehilite\"><pre><span></span>def hello(): print 'hello\n</pre></div>\n\n\n<p>They would prefer:</p>\n</blockquote>\n<p>def hello()<br>\n  puts 'hello'<br>\nend</p>\n</blockquote>\n<p>Please advise.</p>\n<div class=\"codehilite\"><pre><span></span>\nShe said:\n~~~ quote\nJust send them this:\n```\necho &quot;hello\n&quot;\n```\n</pre></div>",
-        "text_content": "I heard about this second hand...\n\nHe said:\n\nThe customer is complaining.\nThey looked at this code:\ndef hello(): print 'hello\n\n\n\nThey would prefer:\n\ndef hello()\n  puts 'hello'\nend\n\nPlease advise.\nShe said:\n~~~ quote\nJust send them this:\n```\necho \"hello\n\"\n```\n"
+        "text_content": "I heard about this second hand...\n> He said:\n> > The customer is complaining.\n> > They looked at this code:\n> > def hello(): print 'hello\n> > They would prefer:\n> def hello()\n>   puts 'hello'\n> end\n\nPlease advise.\nShe said:\n~~~ quote\nJust send them this:\n```\necho \"hello\n\"\n```\n"
     },
     {
         "name": "fenced_quotes_inside_mathblock",
@@ -92,7 +92,7 @@
         "name": "fenced_quote_with_hashtag",
         "input": "```quote\n# line 1\n# line 2\n```",
         "expected_output": "<blockquote>\n<p># line 1<br>\n# line 2</p>\n</blockquote>",
-        "text_content": "\n# line 1\n# line 2\n"
+        "text_content": "> # line 1\n> # line 2\n"
     },
     {
         "name": "dangerous_block",
@@ -285,7 +285,7 @@
         "input": ">Google logo today:\n>https://www.google.com/images/srpr/logo4w.png\n>Kinda boring",
         "expected_output": "<blockquote>\n<p>Google logo today:<br>\n<a href=\"https://www.google.com/images/srpr/logo4w.png\" target=\"_blank\" title=\"https://www.google.com/images/srpr/logo4w.png\">https://www.google.com/images/srpr/logo4w.png</a><br>\nKinda boring</p>\n<div class=\"message_inline_image\"><a href=\"https://www.google.com/images/srpr/logo4w.png\" target=\"_blank\" title=\"https://www.google.com/images/srpr/logo4w.png\"><img src=\"https://www.google.com/images/srpr/logo4w.png\"></a></div></blockquote>",
         "backend_only_rendering": true,
-        "text_content": "\nGoogle logo today:\nhttps:\/\/www.google.com\/images\/srpr\/logo4w.png\nKinda boring\n"
+        "text_content": "> Google logo today:\n> https:\/\/www.google.com\/images\/srpr\/logo4w.png\n> Kinda boring\n"
     },
     {
         "name": "two_inline_images",
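The fixture updates above switch `text_content` for quoted blocks from blank-line separation to `> `-prefixed lines, with one extra prefix per nesting level. As a rough illustration of that convention (a hypothetical helper, not Zulip's actual text renderer):

```python
# Hypothetical sketch of the "> "-prefix convention the fixtures above
# now expect; depth controls the quote nesting level.
def quote_text(text: str, depth: int = 1) -> str:
    prefix = "> " * depth
    # Prefix every line of the quoted material.
    return "\n".join(prefix + line for line in text.splitlines())

print(quote_text("To be or not to be.\nThat is the question"))
# → > To be or not to be.
#   > That is the question
```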
@@ -1,35 +0,0 @@
-[{
-    "message_retention_days": null,
-    "inline_image_preview": true,
-    "name_changes_disabled": false,
-    "icon_version": 1,
-    "waiting_period_threshold": 0,
-    "email_changes_disabled": false,
-    "deactivated": false,
-    "notifications_stream": null,
-    "restricted_to_domain": false,
-    "show_digest_email": true,
-    "allow_message_editing": true,
-    "description": "Organization imported from Slack!",
-    "default_language": "en",
-    "icon_source": "G",
-    "invite_required": false,
-    "invite_by_admins_only": false,
-    "create_stream_by_admins_only": false,
-    "mandatory_topics": false,
-    "inline_url_embed_preview": true,
-    "message_content_edit_limit_seconds": 600,
-    "authentication_methods": [
-        ["Google", true],
-        ["Email", true],
-        ["GitHub", true],
-        ["LDAP", true],
-        ["Dev", true],
-        ["RemoteUser", true]
-    ],
-    "name": "",
-    "org_type": 1,
-    "add_emoji_by_admins_only": false,
-    "date_created": 0.0,
-    "id": 1
-}]
@@ -55,7 +55,7 @@ def email_is_not_mit_mailing_list(email: Text) -> None:
         else:
             raise AssertionError("Unexpected DNS error")
 
-def check_subdomain_available(subdomain: str) -> None:
+def check_subdomain_available(subdomain: str, from_management_command: bool=False) -> None:
     error_strings = {
         'too short': _("Subdomain needs to have length 3 or greater."),
         'extremal dash': _("Subdomain cannot start or end with a '-'."),
@@ -70,6 +70,8 @@ def check_subdomain_available(subdomain: str) -> None:
         raise ValidationError(error_strings['extremal dash'])
     if not re.match('^[a-z0-9-]*$', subdomain):
         raise ValidationError(error_strings['bad character'])
+    if from_management_command:
+        return
     if len(subdomain) < 3:
         raise ValidationError(error_strings['too short'])
     if is_reserved_subdomain(subdomain) or \
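The hunk above adds a `from_management_command` flag that keeps the character and dash checks mandatory while letting management-command callers skip the length and reserved-name policy checks. A minimal self-contained sketch of that control flow (using `ValueError` in place of Django's `ValidationError`, with a hypothetical reserved-name set):

```python
import re

RESERVED_SUBDOMAINS = {"www", "api"}  # hypothetical example set

def check_subdomain_available(subdomain: str, from_management_command: bool = False) -> None:
    # Syntactic checks always run, for every caller.
    if subdomain.startswith('-') or subdomain.endswith('-'):
        raise ValueError("Subdomain cannot start or end with a '-'.")
    if not re.match('^[a-z0-9-]*$', subdomain):
        raise ValueError("Subdomain can only have lowercase letters, digits, and '-'s.")
    if from_management_command:
        # Management commands are trusted to bypass the policy checks below.
        return
    if len(subdomain) < 3:
        raise ValueError("Subdomain needs to have length 3 or greater.")
    if subdomain in RESERVED_SUBDOMAINS:
        raise ValueError("Subdomain unavailable.")

check_subdomain_available("ab", from_management_command=True)  # passes: policy checks skipped
```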
@@ -314,7 +314,9 @@ def process_missed_message(to: Text, message: message.Message, pre_checked: bool
         send_to_missed_message_address(to, message)
 
 def process_message(message: message.Message, rcpt_to: Optional[Text]=None, pre_checked: bool=False) -> None:
-    subject_header = message.get("Subject", "(no subject)")
+    subject_header = str(message.get("Subject", "")).strip()
+    if subject_header == "":
+        subject_header = "(no topic)"
     encoded_subject, encoding = decode_header(subject_header)[0]
     if encoding is None:
         subject = force_text(encoded_subject)  # encoded_subject has type str when encoding is None
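The change above makes a missing or whitespace-only `Subject` header fall back to `"(no topic)"` before RFC 2047 decoding. A standalone sketch of that subject-extraction step (an illustration of the hunk, not the full Zulip pipeline, and using plain `.decode()` in place of `force_text`):

```python
from email.header import decode_header
from email.message import Message

def extract_subject(message: Message) -> str:
    # Missing or blank Subject headers become "(no topic)".
    subject_header = str(message.get("Subject", "")).strip()
    if subject_header == "":
        subject_header = "(no topic)"
    # decode_header handles RFC 2047 encoded words like =?utf-8?q?...?=
    encoded_subject, encoding = decode_header(subject_header)[0]
    if encoding is None:
        return encoded_subject  # already a plain str
    return encoded_subject.decode(encoding)

msg = Message()
print(extract_subject(msg))  # → (no topic)
```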
@@ -1,31 +1,22 @@
|
|||||||
import datetime
|
import datetime
|
||||||
from boto.s3.key import Key
|
|
||||||
from boto.s3.connection import S3Connection
|
from boto.s3.connection import S3Connection
|
||||||
from django.conf import settings
|
from django.conf import settings
|
||||||
from django.db import connection
|
|
||||||
from django.forms.models import model_to_dict
|
from django.forms.models import model_to_dict
|
||||||
from django.utils.timezone import make_aware as timezone_make_aware
|
from django.utils.timezone import make_aware as timezone_make_aware
|
||||||
from django.utils.timezone import utc as timezone_utc
|
from django.utils.timezone import utc as timezone_utc
|
||||||
from django.utils.timezone import is_naive as timezone_is_naive
|
from django.utils.timezone import is_naive as timezone_is_naive
|
||||||
from django.db.models.query import QuerySet
|
|
||||||
import glob
|
import glob
|
||||||
import logging
|
import logging
|
||||||
import os
|
import os
|
||||||
import ujson
|
import ujson
|
||||||
import shutil
|
|
||||||
import subprocess
|
import subprocess
|
||||||
import tempfile
|
import tempfile
|
||||||
from zerver.lib.upload import random_name, sanitize_name
|
from zerver.lib.avatar_hash import user_avatar_path_from_ids
|
||||||
from zerver.lib.avatar_hash import user_avatar_hash, user_avatar_path_from_ids
|
|
||||||
from zerver.lib.upload import S3UploadBackend, LocalUploadBackend
|
|
||||||
from zerver.lib.create_user import random_api_key
|
|
||||||
from zerver.lib.bulk_create import bulk_create_users
|
|
||||||
from zerver.models import UserProfile, Realm, Client, Huddle, Stream, \
|
from zerver.models import UserProfile, Realm, Client, Huddle, Stream, \
|
||||||
UserMessage, Subscription, Message, RealmEmoji, RealmFilter, \
|
UserMessage, Subscription, Message, RealmEmoji, RealmFilter, \
|
||||||
RealmDomain, Recipient, DefaultStream, get_user_profile_by_id, \
|
RealmDomain, Recipient, DefaultStream, get_user_profile_by_id, \
|
||||||
UserPresence, UserActivity, UserActivityInterval, Reaction, \
|
UserPresence, UserActivity, UserActivityInterval, \
|
||||||
CustomProfileField, CustomProfileFieldValue, \
|
get_display_recipient, Attachment, get_system_bot
|
||||||
get_display_recipient, Attachment, get_system_bot, email_to_username
|
|
||||||
from zerver.lib.parallel import run_parallel
|
from zerver.lib.parallel import run_parallel
|
||||||
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, \
|
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, \
|
||||||
Iterable, Text
|
Iterable, Text
|
||||||
@@ -1205,675 +1196,3 @@ def export_messages_single_user(user_profile: UserProfile, output_dir: Path, chu
|
|||||||
write_message_export(message_filename, output)
|
write_message_export(message_filename, output)
|
||||||
min_id = max(user_message_ids)
|
min_id = max(user_message_ids)
|
||||||
dump_file_id += 1
|
dump_file_id += 1
|
||||||
|
|
||||||
# Code from here is the realm import code path
|
|
||||||
|
|
||||||
# id_maps is a dictionary that maps table names to dictionaries
|
|
||||||
# that map old ids to new ids. We use this in
|
|
||||||
# re_map_foreign_keys and other places.
|
|
||||||
#
|
|
||||||
# We explicity initialize id_maps with the tables that support
|
|
||||||
# id re-mapping.
|
|
||||||
#
|
|
||||||
# Code reviewers: give these tables extra scrutiny, as we need to
|
|
||||||
# make sure to reload related tables AFTER we re-map the ids.
|
|
||||||
id_maps = {
|
|
||||||
'client': {},
|
|
||||||
'user_profile': {},
|
|
||||||
'realm': {},
|
|
||||||
'stream': {},
|
|
||||||
'recipient': {},
|
|
||||||
'subscription': {},
|
|
||||||
'defaultstream': {},
|
|
||||||
'reaction': {},
|
|
||||||
'realmemoji': {},
|
|
||||||
'realmdomain': {},
|
|
||||||
'realmfilter': {},
|
|
||||||
'message': {},
|
|
||||||
'user_presence': {},
|
|
||||||
'useractivity': {},
|
|
||||||
'useractivityinterval': {},
|
|
||||||
'usermessage': {},
|
|
||||||
'customprofilefield': {},
|
|
||||||
'customprofilefield_value': {},
|
|
||||||
'attachment': {},
|
|
||||||
} # type: Dict[str, Dict[int, int]]
|
|
||||||
|
|
||||||
path_maps = {
|
|
||||||
'attachment_path': {},
|
|
||||||
} # type: Dict[str, Dict[str, str]]
|
|
||||||
|
|
||||||
def update_id_map(table: TableName, old_id: int, new_id: int) -> None:
|
|
||||||
if table not in id_maps:
|
|
||||||
raise Exception('''
|
|
||||||
Table %s is not initialized in id_maps, which could
|
|
||||||
mean that we have not thought through circular
|
|
||||||
dependencies.
|
|
||||||
''' % (table,))
|
|
||||||
id_maps[table][old_id] = new_id
|
|
||||||
|
|
||||||
def fix_datetime_fields(data: TableData, table: TableName) -> None:
|
|
||||||
for item in data[table]:
|
|
||||||
for field_name in DATE_FIELDS[table]:
|
|
||||||
if item[field_name] is not None:
|
|
||||||
item[field_name] = datetime.datetime.fromtimestamp(item[field_name], tz=timezone_utc)
|
|
||||||
|
|
||||||
def fix_upload_links(data: TableData, message_table: TableName) -> None:
    """
    Because the URLs for uploaded files encode the realm ID of the
    organization being imported (which is only determined at import
    time), we need to rewrite the URLs of links to uploaded files
    during the import process.
    """
    for message in data[message_table]:
        if message['has_attachment'] is True:
            for key, value in path_maps['attachment_path'].items():
                if key in message['content']:
                    message['content'] = message['content'].replace(key, value)
                    if message['rendered_content']:
                        message['rendered_content'] = message['rendered_content'].replace(key, value)

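A minimal sketch of the rewrite above, using made-up old/new upload paths in place of the real `path_maps` entries:

```python
# path_map plays the role of path_maps['attachment_path']: old export
# path -> newly allocated server path.  All paths here are invented.
path_map = {'2/old/zulip.txt': '9/abcdefghij/zulip.txt'}

message = {
    'has_attachment': True,
    'content': 'see /user_uploads/2/old/zulip.txt',
    'rendered_content': '<a href="/user_uploads/2/old/zulip.txt">file</a>',
}

# Same replace-in-both-renderings loop as the function above.
if message['has_attachment']:
    for old, new in path_map.items():
        if old in message['content']:
            message['content'] = message['content'].replace(old, new)
            if message['rendered_content']:
                message['rendered_content'] = message['rendered_content'].replace(old, new)

print(message['content'])  # see /user_uploads/9/abcdefghij/zulip.txt
```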
def current_table_ids(data: TableData, table: TableName) -> List[int]:
    """
    Returns the ids present in the current table
    """
    id_list = []
    for item in data[table]:
        id_list.append(item["id"])
    return id_list

def idseq(model_class: Any) -> str:
    if model_class == RealmDomain:
        return 'zerver_realmalias_id_seq'
    return '{}_id_seq'.format(model_class._meta.db_table)

def allocate_ids(model_class: Any, count: int) -> List[int]:
    """
    Increases the sequence number for a given table by the number of objects being
    imported into that table.  Hence, this gives a reserved range of ids to import the
    converted slack objects into the tables.
    """
    conn = connection.cursor()
    sequence = idseq(model_class)
    conn.execute("select nextval('%s') from generate_series(1,%s)" %
                 (sequence, str(count)))
    query = conn.fetchall()  # Each element in the result is a tuple like (5,)
    conn.close()
    # convert List[Tuple[int]] to List[int]
    return [item[0] for item in query]

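The SQL above draws `count` values from the table's Postgres sequence via `nextval`/`generate_series`; as a rough in-memory stand-in (with `itertools.count` playing the role of the database sequence, and a made-up starting value):

```python
import itertools

# In-memory stand-in for a Postgres sequence: each next() call returns
# the following integer, so drawing `count` values reserves a
# contiguous block of ids for the rows about to be imported.
sequence = itertools.count(start=101)  # start value is arbitrary

def allocate_ids_sketch(count):
    return [next(sequence) for _ in range(count)]

first = allocate_ids_sketch(3)   # reserves 101..103
second = allocate_ids_sketch(2)  # reserves 104..105, no overlap
```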
def convert_to_id_fields(data: TableData, table: TableName, field_name: Field) -> None:
    '''
    When Django gives us dict objects via model_to_dict, the foreign
    key fields are `foo`, but we want `foo_id` for the bulk insert.
    This function handles the simple case where we simply rename
    the fields.  For cases where we need to munge ids in the
    database, see re_map_foreign_keys.
    '''
    for item in data[table]:
        item[field_name + "_id"] = item[field_name]
        del item[field_name]

def re_map_foreign_keys(data: TableData,
                        table: TableName,
                        field_name: Field,
                        related_table: TableName,
                        verbose: bool=False,
                        id_field: bool=False,
                        recipient_field: bool=False) -> None:
    """
    This is a wrapper function for all the realm data tables.  Only
    the avatar and attachment records need to call the internal
    function directly, because of the difference in data format
    (TableData for the realm data tables versus List[Record] for the
    avatar and attachment records).
    """
    re_map_foreign_keys_internal(data[table], table, field_name, related_table, verbose, id_field,
                                 recipient_field)

def re_map_foreign_keys_internal(data_table: List[Record],
                                 table: TableName,
                                 field_name: Field,
                                 related_table: TableName,
                                 verbose: bool=False,
                                 id_field: bool=False,
                                 recipient_field: bool=False) -> None:
    '''
    We occasionally need to assign new ids to rows during the
    import/export process, to accommodate things like existing rows
    already being in tables.  See bulk_import_client for more context.

    The tricky part is making sure that foreign key references
    are in sync with the new ids, and this fixer function does
    the re-mapping.  (It also appends `_id` to the field.)
    '''
    lookup_table = id_maps[related_table]
    for item in data_table:
        if recipient_field:
            if related_table == "stream" and item['type'] == 2:
                pass
            elif related_table == "user_profile" and item['type'] == 1:
                pass
            else:
                continue
        old_id = item[field_name]
        if old_id in lookup_table:
            new_id = lookup_table[old_id]
            if verbose:
                logging.info('Remapping %s %s from %s to %s' % (table,
                                                                field_name + '_id',
                                                                old_id,
                                                                new_id))
        else:
            new_id = old_id
        if not id_field:
            item[field_name + "_id"] = new_id
            del item[field_name]
        else:
            item[field_name] = new_id

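The remapping above can be sketched on a plain list of dicts with a hypothetical lookup table (field name and ids invented): old ids found in the lookup table get translated, unknown ids pass through, and the field is renamed `foo` → `foo_id`.

```python
# Hypothetical old-id -> new-id lookup, standing in for id_maps[related_table].
lookup_table = {10: 210, 11: 211}

rows = [{'sender': 10}, {'sender': 11}, {'sender': 99}]
for item in rows:
    old_id = item['sender']
    # Translate if we have a mapping; otherwise keep the old id.
    new_id = lookup_table.get(old_id, old_id)
    # Rename sender -> sender_id for the bulk insert (id_field=False case).
    item['sender_id'] = new_id
    del item['sender']

print(rows)  # [{'sender_id': 210}, {'sender_id': 211}, {'sender_id': 99}]
```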
def fix_bitfield_keys(data: TableData, table: TableName, field_name: Field) -> None:
    for item in data[table]:
        item[field_name] = item[field_name + '_mask']
        del item[field_name + '_mask']

def fix_realm_authentication_bitfield(data: TableData, table: TableName, field_name: Field) -> None:
    """Used to fix the authentication_methods bitfield to be an integer"""
    for item in data[table]:
        values_as_bitstring = ''.join(['1' if field[1] else '0' for field in
                                       item[field_name]])
        values_as_int = int(values_as_bitstring, 2)
        item[field_name] = values_as_int

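A sketch of the conversion above, with a made-up list of (backend, enabled) pairs: the export serializes the bitfield as pairs, and the import packs the booleans back into the integer the column stores.

```python
# Invented example value as it would appear in the exported JSON.
item = {'authentication_methods': [('Email', True), ('GitHub', False), ('LDAP', True)]}

# Same packing as the function above: booleans -> '101' -> 5.
values_as_bitstring = ''.join(['1' if field[1] else '0'
                               for field in item['authentication_methods']])
item['authentication_methods'] = int(values_as_bitstring, 2)

print(item['authentication_methods'])  # 5, i.e. 0b101
```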
def update_model_ids(model: Any, data: TableData, table: TableName, related_table: TableName) -> None:
    old_id_list = current_table_ids(data, table)
    allocated_id_list = allocate_ids(model, len(data[table]))
    for item in range(len(data[table])):
        update_id_map(related_table, old_id_list[item], allocated_id_list[item])
    re_map_foreign_keys(data, table, 'id', related_table=related_table, id_field=True)

def bulk_import_model(data: TableData, model: Any, table: TableName,
                      dump_file_id: Optional[str]=None) -> None:
    # TODO: deprecate dump_file_id
    model.objects.bulk_create(model(**item) for item in data[table])
    if dump_file_id is None:
        logging.info("Successfully imported %s from %s." % (model, table))
    else:
        logging.info("Successfully imported %s from %s[%s]." % (model, table, dump_file_id))

# Client is a table shared by multiple realms, so in order to
# correctly import multiple realms into the same server, we need to
# check whether a Client object already exists, and remap all Client
# IDs to the values in the new DB.
def bulk_import_client(data: TableData, model: Any, table: TableName) -> None:
    for item in data[table]:
        try:
            client = Client.objects.get(name=item['name'])
        except Client.DoesNotExist:
            client = Client.objects.create(name=item['name'])
        update_id_map(table='client', old_id=item['id'], new_id=client.id)

def import_uploads_local(import_dir: Path, processing_avatars: bool=False,
                         processing_emojis: bool=False) -> None:
    records_filename = os.path.join(import_dir, "records.json")
    with open(records_filename) as records_file:
        records = ujson.loads(records_file.read())

    re_map_foreign_keys_internal(records, 'records', 'realm_id', related_table="realm",
                                 id_field=True)
    if not processing_emojis:
        re_map_foreign_keys_internal(records, 'records', 'user_profile_id',
                                     related_table="user_profile", id_field=True)
    for record in records:
        if processing_avatars:
            # For avatars, we need to rehash the user ID with the
            # new server's avatar salt
            avatar_path = user_avatar_path_from_ids(record['user_profile_id'], record['realm_id'])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars", avatar_path)
            if record['s3_path'].endswith('.original'):
                file_path += '.original'
            else:
                file_path += '.png'
        elif processing_emojis:
            # For emojis, we follow the path format of 'upload_emoji_image'
            emoji_path = RealmEmoji.PATH_ID_TEMPLATE.format(
                realm_id=record['realm_id'],
                emoji_file_name=record['file_name'])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars", emoji_path)
        else:
            # Should be kept in sync with its equivalent in zerver/lib/uploads in the
            # function 'upload_message_image'
            s3_file_name = "/".join([
                str(record['realm_id']),
                random_name(18),
                sanitize_name(os.path.basename(record['path']))
            ])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "files", s3_file_name)
            path_maps['attachment_path'][record['path']] = s3_file_name

        orig_file_path = os.path.join(import_dir, record['path'])
        if not os.path.exists(os.path.dirname(file_path)):
            subprocess.check_call(["mkdir", "-p", os.path.dirname(file_path)])
        shutil.copy(orig_file_path, file_path)

    if processing_avatars:
        # Ensure that we have medium-size avatar images for every
        # avatar.  TODO: This implementation is hacky, both in that it
        # does get_user_profile_by_id for each user, and in that it
        # might be better to require the export to just have these.
        upload_backend = LocalUploadBackend()
        for record in records:
            if record['s3_path'].endswith('.original'):
                user_profile = get_user_profile_by_id(record['user_profile_id'])
                avatar_path = user_avatar_path_from_ids(user_profile.id, record['realm_id'])
                medium_file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars",
                                                avatar_path) + '-medium.png'
                if os.path.exists(medium_file_path):
                    # We remove the image here primarily to deal with
                    # issues when running the import script multiple
                    # times in development (where one might reuse the
                    # same realm ID from a previous iteration).
                    os.remove(medium_file_path)
                upload_backend.ensure_medium_avatar_image(user_profile=user_profile)

def import_uploads_s3(bucket_name: str, import_dir: Path, processing_avatars: bool=False,
                      processing_emojis: bool=False) -> None:
    upload_backend = S3UploadBackend()
    conn = S3Connection(settings.S3_KEY, settings.S3_SECRET_KEY)
    bucket = conn.get_bucket(bucket_name, validate=True)

    records_filename = os.path.join(import_dir, "records.json")
    with open(records_filename) as records_file:
        records = ujson.loads(records_file.read())

    re_map_foreign_keys_internal(records, 'records', 'realm_id', related_table="realm",
                                 id_field=True)
    re_map_foreign_keys_internal(records, 'records', 'user_profile_id',
                                 related_table="user_profile", id_field=True)
    for record in records:
        key = Key(bucket)

        if processing_avatars:
            # For avatars, we need to rehash the user ID with the
            # new server's avatar salt
            avatar_path = user_avatar_path_from_ids(record['user_profile_id'], record['realm_id'])
            key.key = avatar_path
            if record['s3_path'].endswith('.original'):
                key.key += '.original'
        elif processing_emojis:
            # For emojis, we follow the path format of 'upload_emoji_image'
            emoji_path = RealmEmoji.PATH_ID_TEMPLATE.format(
                realm_id=record['realm_id'],
                emoji_file_name=record['file_name'])
            key.key = emoji_path
        else:
            # Should be kept in sync with its equivalent in zerver/lib/uploads in the
            # function 'upload_message_image'
            s3_file_name = "/".join([
                str(record['realm_id']),
                random_name(18),
                sanitize_name(os.path.basename(record['path']))
            ])
            key.key = s3_file_name
            path_maps['attachment_path'][record['path']] = s3_file_name

        user_profile_id = int(record['user_profile_id'])
        # Support email gateway bot and other cross-realm messages
        if user_profile_id in id_maps["user_profile"]:
            logging.info("Uploaded by ID mapped user: %s!" % (user_profile_id,))
            user_profile_id = id_maps["user_profile"][user_profile_id]
        user_profile = get_user_profile_by_id(user_profile_id)
        key.set_metadata("user_profile_id", str(user_profile.id))
        key.set_metadata("realm_id", str(user_profile.realm_id))
        key.set_metadata("orig_last_modified", record['last_modified'])

        headers = {'Content-Type': record['content_type']}

        key.set_contents_from_filename(os.path.join(import_dir, record['path']), headers=headers)

        if processing_avatars:
            # TODO: Ideally, we'd do this in a separate pass, after
            # all the avatars have been uploaded, since we may end up
            # unnecessarily resizing images just before the medium-size
            # image in the export is uploaded.  See the local uploads
            # code path for more notes.
            upload_backend.ensure_medium_avatar_image(user_profile=user_profile)

def import_uploads(import_dir: Path, processing_avatars: bool=False,
                   processing_emojis: bool=False) -> None:
    if processing_avatars:
        logging.info("Importing avatars")
    elif processing_emojis:
        logging.info("Importing emojis")
    else:
        logging.info("Importing uploaded files")
    if settings.LOCAL_UPLOADS_DIR:
        import_uploads_local(import_dir, processing_avatars=processing_avatars,
                             processing_emojis=processing_emojis)
    else:
        if processing_avatars or processing_emojis:
            bucket_name = settings.S3_AVATAR_BUCKET
        else:
            bucket_name = settings.S3_AUTH_UPLOADS_BUCKET
        import_uploads_s3(bucket_name, import_dir, processing_avatars=processing_avatars,
                          processing_emojis=processing_emojis)

# Importing data suffers from a difficult ordering problem because of
# models that reference each other circularly.  Here is a correct order.
#
# * Client [no deps]
# * Realm [-notifications_stream]
# * Stream [only depends on realm]
# * Realm's notifications_stream
# * Now can do all realm_tables
# * UserProfile, in order by ID to avoid bot loop issues
# * Huddle
# * Recipient
# * Subscription
# * Message
# * UserMessage
#
# Because the Python object => JSON conversion process is not fully
# faithful, we have to use a set of fixers (e.g. on DateTime objects
# and Foreign Keys) to do the import correctly.
def do_import_realm(import_dir: Path, subdomain: str) -> Realm:
    logging.info("Importing realm dump %s" % (import_dir,))
    if not os.path.exists(import_dir):
        raise Exception("Missing import directory!")

    realm_data_filename = os.path.join(import_dir, "realm.json")
    if not os.path.exists(realm_data_filename):
        raise Exception("Missing realm.json file!")

    logging.info("Importing realm data from %s" % (realm_data_filename,))
    with open(realm_data_filename) as f:
        data = ujson.load(f)

    update_model_ids(Stream, data, 'zerver_stream', 'stream')
    re_map_foreign_keys(data, 'zerver_realm', 'notifications_stream', related_table="stream")

    fix_datetime_fields(data, 'zerver_realm')
    # Fix realm subdomain information
    data['zerver_realm'][0]['string_id'] = subdomain
    data['zerver_realm'][0]['name'] = subdomain
    fix_realm_authentication_bitfield(data, 'zerver_realm', 'authentication_methods')
    update_model_ids(Realm, data, 'zerver_realm', 'realm')

    realm = Realm(**data['zerver_realm'][0])
    if realm.notifications_stream_id is not None:
        notifications_stream_id = int(realm.notifications_stream_id)  # type: Optional[int]
    else:
        notifications_stream_id = None
    realm.notifications_stream_id = None
    realm.save()
    bulk_import_client(data, Client, 'zerver_client')

    # Email tokens will automatically be randomly generated when the
    # Stream objects are created by Django.
    fix_datetime_fields(data, 'zerver_stream')
    re_map_foreign_keys(data, 'zerver_stream', 'realm', related_table="realm")
    bulk_import_model(data, Stream, 'zerver_stream')

    realm.notifications_stream_id = notifications_stream_id
    realm.save()

    re_map_foreign_keys(data, 'zerver_defaultstream', 'stream', related_table="stream")
    re_map_foreign_keys(data, 'zerver_realmemoji', 'author', related_table="user_profile")
    for (table, model, related_table) in realm_tables:
        re_map_foreign_keys(data, table, 'realm', related_table="realm")
        update_model_ids(model, data, table, related_table)
        bulk_import_model(data, model, table)

    # Remap the user IDs for notification_bot and friends to their
    # appropriate IDs on this server
    for item in data['zerver_userprofile_crossrealm']:
        logging.info("Adding to ID map: %s %s" % (item['id'], get_system_bot(item['email']).id))
        new_user_id = get_system_bot(item['email']).id
        update_id_map(table='user_profile', old_id=item['id'], new_id=new_user_id)

    # Merge in zerver_userprofile_mirrordummy
    data['zerver_userprofile'] = data['zerver_userprofile'] + data['zerver_userprofile_mirrordummy']
    del data['zerver_userprofile_mirrordummy']
    data['zerver_userprofile'].sort(key=lambda r: r['id'])

    # To remap foreign key for UserProfile.last_active_message_id
    update_message_foreign_keys(import_dir)

    fix_datetime_fields(data, 'zerver_userprofile')
    update_model_ids(UserProfile, data, 'zerver_userprofile', 'user_profile')
    re_map_foreign_keys(data, 'zerver_userprofile', 'realm', related_table="realm")
    re_map_foreign_keys(data, 'zerver_userprofile', 'bot_owner', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_userprofile', 'default_sending_stream',
                        related_table="stream")
    re_map_foreign_keys(data, 'zerver_userprofile', 'default_events_register_stream',
                        related_table="stream")
    re_map_foreign_keys(data, 'zerver_userprofile', 'last_active_message_id',
                        related_table="message", id_field=True)
    for user_profile_dict in data['zerver_userprofile']:
        user_profile_dict['password'] = None
        user_profile_dict['api_key'] = random_api_key()
        # Since Zulip doesn't use these permissions, drop them
        del user_profile_dict['user_permissions']
        del user_profile_dict['groups']

    user_profiles = [UserProfile(**item) for item in data['zerver_userprofile']]
    for user_profile in user_profiles:
        user_profile.set_unusable_password()
    UserProfile.objects.bulk_create(user_profiles)

    if 'zerver_huddle' in data:
        bulk_import_model(data, Huddle, 'zerver_huddle')

    re_map_foreign_keys(data, 'zerver_recipient', 'type_id', related_table="stream",
                        recipient_field=True, id_field=True)
    re_map_foreign_keys(data, 'zerver_recipient', 'type_id', related_table="user_profile",
                        recipient_field=True, id_field=True)
    update_model_ids(Recipient, data, 'zerver_recipient', 'recipient')
    bulk_import_model(data, Recipient, 'zerver_recipient')

    re_map_foreign_keys(data, 'zerver_subscription', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_subscription', 'recipient', related_table="recipient")
    update_model_ids(Subscription, data, 'zerver_subscription', 'subscription')
    bulk_import_model(data, Subscription, 'zerver_subscription')

    fix_datetime_fields(data, 'zerver_userpresence')
    re_map_foreign_keys(data, 'zerver_userpresence', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_userpresence', 'client', related_table='client')
    update_model_ids(UserPresence, data, 'zerver_userpresence', 'user_presence')
    bulk_import_model(data, UserPresence, 'zerver_userpresence')

    fix_datetime_fields(data, 'zerver_useractivity')
    re_map_foreign_keys(data, 'zerver_useractivity', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_useractivity', 'client', related_table='client')
    update_model_ids(UserActivity, data, 'zerver_useractivity', 'useractivity')
    bulk_import_model(data, UserActivity, 'zerver_useractivity')

    fix_datetime_fields(data, 'zerver_useractivityinterval')
    re_map_foreign_keys(data, 'zerver_useractivityinterval', 'user_profile', related_table="user_profile")
    update_model_ids(UserActivityInterval, data, 'zerver_useractivityinterval',
                     'useractivityinterval')
    bulk_import_model(data, UserActivityInterval, 'zerver_useractivityinterval')

    if 'zerver_customprofilefield' in data:
        # As the export of custom profile fields is not supported yet,
        # data exported from a Zulip server will not contain this
        # table; however, the Slack importer does generate it.
        re_map_foreign_keys(data, 'zerver_customprofilefield', 'realm', related_table="realm")
        update_model_ids(CustomProfileField, data, 'zerver_customprofilefield',
                         related_table="customprofilefield")
        bulk_import_model(data, CustomProfileField, 'zerver_customprofilefield')

        re_map_foreign_keys(data, 'zerver_customprofilefield_value', 'user_profile',
                            related_table="user_profile")
        re_map_foreign_keys(data, 'zerver_customprofilefield_value', 'field',
                            related_table="customprofilefield")
        update_model_ids(CustomProfileFieldValue, data, 'zerver_customprofilefield_value',
                         related_table="customprofilefield_value")
        bulk_import_model(data, CustomProfileFieldValue, 'zerver_customprofilefield_value')

    # Import uploaded files and avatars
    import_uploads(os.path.join(import_dir, "avatars"), processing_avatars=True)
    import_uploads(os.path.join(import_dir, "uploads"))

    # We need this check because emoji files are only present in data
    # generated by the Slack importer; Zulip exports don't include an
    # emoji directory.
    if os.path.exists(os.path.join(import_dir, "emoji")):
        import_uploads(os.path.join(import_dir, "emoji"), processing_emojis=True)

    # Import zerver_message and zerver_usermessage
    import_message_data(import_dir)

    # Do attachments AFTER message data is loaded.
    # TODO: de-dup how we read these json files.
    fn = os.path.join(import_dir, "attachment.json")
    if not os.path.exists(fn):
        raise Exception("Missing attachment.json file!")

    logging.info("Importing attachment data from %s" % (fn,))
    with open(fn) as f:
        data = ujson.load(f)

    import_attachments(data)
    return realm

# create_users and do_import_system_bots differ from their equivalents
# in zerver/management/commands/initialize_voyager_db.py because here
# we check whether each bot already exists, and only create users for
# the bots that don't.
def do_import_system_bots(realm: Any) -> None:
    internal_bots = [(bot['name'], bot['email_template'] % (settings.INTERNAL_BOT_DOMAIN,))
                     for bot in settings.INTERNAL_BOTS]
    create_users(realm, internal_bots, bot_type=UserProfile.DEFAULT_BOT)
    names = [(settings.FEEDBACK_BOT_NAME, settings.FEEDBACK_BOT)]
    create_users(realm, names, bot_type=UserProfile.DEFAULT_BOT)
    print("Finished importing system bots.")

def create_users(realm: Realm, name_list: Iterable[Tuple[Text, Text]],
                 bot_type: Optional[int]=None) -> None:
    user_set = set()
    for full_name, email in name_list:
        short_name = email_to_username(email)
        if not UserProfile.objects.filter(email=email):
            user_set.add((email, full_name, short_name, True))
    bulk_create_users(realm, user_set, bot_type)

def update_message_foreign_keys(import_dir: Path) -> None:
    dump_file_id = 1
    while True:
        message_filename = os.path.join(import_dir, "messages-%06d.json" % (dump_file_id,))
        if not os.path.exists(message_filename):
            break

        with open(message_filename) as f:
            data = ujson.load(f)

        update_model_ids(Message, data, 'zerver_message', 'message')
        dump_file_id += 1

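The dump files above follow a zero-padded naming scheme, and both this function and `import_message_data` walk them in order until the next file is missing; a quick illustration of the naming:

```python
# The export splits messages across numbered dump files:
# messages-000001.json, messages-000002.json, ...
filenames = ["messages-%06d.json" % (dump_file_id,)
             for dump_file_id in range(1, 4)]
print(filenames)  # ['messages-000001.json', 'messages-000002.json', 'messages-000003.json']
```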
def import_message_data(import_dir: Path) -> None:
    dump_file_id = 1
    while True:
        message_filename = os.path.join(import_dir, "messages-%06d.json" % (dump_file_id,))
        if not os.path.exists(message_filename):
            break

        with open(message_filename) as f:
            data = ujson.load(f)

        logging.info("Importing message dump %s" % (message_filename,))
        re_map_foreign_keys(data, 'zerver_message', 'sender', related_table="user_profile")
        re_map_foreign_keys(data, 'zerver_message', 'recipient', related_table="recipient")
        re_map_foreign_keys(data, 'zerver_message', 'sending_client', related_table='client')
        fix_datetime_fields(data, 'zerver_message')
        # Rewrite message content to use the updated attachment URLs
        fix_upload_links(data, 'zerver_message')

        re_map_foreign_keys(data, 'zerver_message', 'id', related_table='message', id_field=True)
        bulk_import_model(data, Message, 'zerver_message')

        # Due to the structure of these message chunks, we're
        # guaranteed to have already imported all the Message objects
        # for this batch of UserMessage objects.
        re_map_foreign_keys(data, 'zerver_usermessage', 'message', related_table="message")
        re_map_foreign_keys(data, 'zerver_usermessage', 'user_profile', related_table="user_profile")
        fix_bitfield_keys(data, 'zerver_usermessage', 'flags')
        update_model_ids(UserMessage, data, 'zerver_usermessage', 'usermessage')
        bulk_import_model(data, UserMessage, 'zerver_usermessage')

        # As the export of reactions is not supported yet, data
        # exported from a Zulip server will not contain this table;
        # however, the Slack importer does generate it.
        if 'zerver_reaction' in data:
            re_map_foreign_keys(data, 'zerver_reaction', 'message', related_table="message")
            re_map_foreign_keys(data, 'zerver_reaction', 'user_profile', related_table="user_profile")
            for reaction in data['zerver_reaction']:
                if reaction['reaction_type'] == Reaction.REALM_EMOJI:
                    re_map_foreign_keys(data, 'zerver_reaction', 'emoji_code',
                                        related_table="realmemoji", id_field=True)
            update_model_ids(Reaction, data, 'zerver_reaction', 'reaction')
            bulk_import_model(data, Reaction, 'zerver_reaction')

        dump_file_id += 1

def import_attachments(data: TableData) -> None:

    # Clean up the data in zerver_attachment that is not
    # relevant to our many-to-many import.
    fix_datetime_fields(data, 'zerver_attachment')
    re_map_foreign_keys(data, 'zerver_attachment', 'owner', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_attachment', 'realm', related_table="realm")

    # Configure ourselves.  Django models many-to-many (m2m)
    # relations asymmetrically.  The parent here refers to the
    # Model that has the ManyToManyField.  It is assumed here
    # the child models have been loaded, but we are in turn
    # responsible for loading the parents and the m2m rows.
    parent_model = Attachment
    parent_db_table_name = 'zerver_attachment'
    parent_singular = 'attachment'
    child_singular = 'message'
    child_plural = 'messages'
    m2m_table_name = 'zerver_attachment_messages'
    parent_id = 'attachment_id'
    child_id = 'message_id'

    update_model_ids(parent_model, data, parent_db_table_name, 'attachment')
    # First, build our list of many-to-many (m2m) rows.
    # We do this in a slightly convoluted way to anticipate
    # a future where we may need to call re_map_foreign_keys.

    m2m_rows = []  # type: List[Record]
    for parent_row in data[parent_db_table_name]:
        for fk_id in parent_row[child_plural]:
            m2m_row = {}  # type: Record
            m2m_row[parent_singular] = parent_row['id']
            m2m_row[child_singular] = id_maps['message'][fk_id]
            m2m_rows.append(m2m_row)

    # Create our table data for insert.
    m2m_data = {m2m_table_name: m2m_rows}  # type: TableData
    convert_to_id_fields(m2m_data, m2m_table_name, parent_singular)
    convert_to_id_fields(m2m_data, m2m_table_name, child_singular)
    m2m_rows = m2m_data[m2m_table_name]

    # Next, delete out our child data from the parent rows.
    for parent_row in data[parent_db_table_name]:
        del parent_row[child_plural]

    # Update 'path_id' for the attachments
    for attachment in data[parent_db_table_name]:
        attachment['path_id'] = path_maps['attachment_path'][attachment['path_id']]

    # Next, load the parent rows.
    bulk_import_model(data, parent_model, parent_db_table_name)

    # Now, go back to our m2m rows.
    # TODO: Do this the kosher Django way.  We may find a
    # better way to do this in Django 1.9 particularly.
    with connection.cursor() as cursor:
        sql_template = '''
            insert into %s (%s, %s) values(%%s, %%s);''' % (m2m_table_name,
                                                            parent_id,
                                                            child_id)
        tups = [(row[parent_id], row[child_id]) for row in m2m_rows]
        cursor.executemany(sql_template, tups)

    logging.info('Successfully imported M2M table %s' % (m2m_table_name,))
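The m2m row construction above can be sketched with hypothetical attachment rows and a made-up message id map: each attachment carries a list of old message ids, and we emit one (attachment, message) row per entry, translating message ids through the map.

```python
# message_id_map stands in for id_maps['message']; all ids are invented.
message_id_map = {7: 107, 8: 108}
attachments = [{'id': 1, 'messages': [7, 8]},
               {'id': 2, 'messages': [8]}]

m2m_rows = []
for parent_row in attachments:
    for fk_id in parent_row['messages']:
        # One join-table row per (attachment, message) pair, with the
        # child id translated to its new value.
        m2m_rows.append({'attachment': parent_row['id'],
                         'message': message_id_map[fk_id]})

print(m2m_rows)
```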
zerver/lib/import_realm.py  (new file, 700 lines)
@@ -0,0 +1,700 @@
import datetime
import logging
import os
import ujson
import shutil
import subprocess

from boto.s3.connection import S3Connection
from boto.s3.key import Key
from django.conf import settings
from django.db import connection
from django.utils.timezone import utc as timezone_utc
from typing import Any, Dict, List, Optional, Set, Tuple, \
    Iterable, Text

from zerver.lib.avatar_hash import user_avatar_path_from_ids
from zerver.lib.bulk_create import bulk_create_users
from zerver.lib.create_user import random_api_key
from zerver.lib.export import DATE_FIELDS, realm_tables, \
    Record, TableData, TableName, Field, Path
from zerver.lib.upload import random_name, sanitize_name, \
    S3UploadBackend, LocalUploadBackend
from zerver.models import UserProfile, Realm, Client, Huddle, Stream, \
    UserMessage, Subscription, Message, RealmEmoji, \
    RealmDomain, Recipient, get_user_profile_by_id, \
    UserPresence, UserActivity, UserActivityInterval, Reaction, \
    CustomProfileField, CustomProfileFieldValue, \
    Attachment, get_system_bot, email_to_username
# Code from here is the realm import code path

# id_maps is a dictionary that maps table names to dictionaries
# that map old ids to new ids.  We use this in
# re_map_foreign_keys and other places.
#
# We explicitly initialize id_maps with the tables that support
# id re-mapping.
#
# Code reviewers: give these tables extra scrutiny, as we need to
# make sure to reload related tables AFTER we re-map the ids.
id_maps = {
    'client': {},
    'user_profile': {},
    'realm': {},
    'stream': {},
    'recipient': {},
    'subscription': {},
    'defaultstream': {},
    'reaction': {},
    'realmemoji': {},
    'realmdomain': {},
    'realmfilter': {},
    'message': {},
    'user_presence': {},
    'useractivity': {},
    'useractivityinterval': {},
    'usermessage': {},
    'customprofilefield': {},
    'customprofilefield_value': {},
    'attachment': {},
}  # type: Dict[str, Dict[int, int]]

path_maps = {
    'attachment_path': {},
}  # type: Dict[str, Dict[str, str]]
def update_id_map(table: TableName, old_id: int, new_id: int) -> None:
    if table not in id_maps:
        raise Exception('''
            Table %s is not initialized in id_maps, which could
            mean that we have not thought through circular
            dependencies.
            ''' % (table,))
    id_maps[table][old_id] = new_id

def fix_datetime_fields(data: TableData, table: TableName) -> None:
    for item in data[table]:
        for field_name in DATE_FIELDS[table]:
            if item[field_name] is not None:
                item[field_name] = datetime.datetime.fromtimestamp(item[field_name], tz=timezone_utc)

def fix_upload_links(data: TableData, message_table: TableName) -> None:
    """
    Because the URLs for uploaded files encode the realm ID of the
    organization being imported (which is only determined at import
    time), we need to rewrite the URLs of links to uploaded files
    during the import process.
    """
    for message in data[message_table]:
        if message['has_attachment'] is True:
            for key, value in path_maps['attachment_path'].items():
                if key in message['content']:
                    message['content'] = message['content'].replace(key, value)
                    if message['rendered_content']:
                        message['rendered_content'] = message['rendered_content'].replace(key, value)
def current_table_ids(data: TableData, table: TableName) -> List[int]:
    """
    Returns the ids present in the current table
    """
    id_list = []
    for item in data[table]:
        id_list.append(item["id"])
    return id_list

def idseq(model_class: Any) -> str:
    if model_class == RealmDomain:
        return 'zerver_realmalias_id_seq'
    return '{}_id_seq'.format(model_class._meta.db_table)

def allocate_ids(model_class: Any, count: int) -> List[int]:
    """
    Increases the sequence number for a given table by the number of objects
    being imported into that table. Hence, this gives a reserved range of ids
    to import the converted Slack objects into the tables.
    """
    conn = connection.cursor()
    sequence = idseq(model_class)
    conn.execute("select nextval('%s') from generate_series(1,%s)" %
                 (sequence, str(count)))
    query = conn.fetchall()  # Each element in the result is a tuple like (5,)
    conn.close()
    # convert List[Tuple[int]] to List[int]
    return [item[0] for item in query]
def convert_to_id_fields(data: TableData, table: TableName, field_name: Field) -> None:
    '''
    When Django gives us dict objects via model_to_dict, the foreign
    key fields are `foo`, but we want `foo_id` for the bulk insert.
    This function handles the simple case where we simply rename
    the fields.  For cases where we need to munge ids in the
    database, see re_map_foreign_keys.
    '''
    for item in data[table]:
        item[field_name + "_id"] = item[field_name]
        del item[field_name]
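The rename is a pure dict transformation, so it is easy to see in isolation. A standalone sketch (the sample subscription row is hypothetical):

```python
# Standalone sketch of the foo -> foo_id renaming that
# convert_to_id_fields performs on each row dict.
def convert_to_id_fields(data, table, field_name):
    for item in data[table]:
        item[field_name + "_id"] = item[field_name]
        del item[field_name]

data = {'zerver_subscription': [{'id': 7, 'recipient': 42}]}
convert_to_id_fields(data, 'zerver_subscription', 'recipient')
print(data['zerver_subscription'][0])
# {'id': 7, 'recipient_id': 42}
```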
def re_map_foreign_keys(data: TableData,
                        table: TableName,
                        field_name: Field,
                        related_table: TableName,
                        verbose: bool=False,
                        id_field: bool=False,
                        recipient_field: bool=False) -> None:
    """
    This is a wrapper function for all the realm data tables; only
    avatar and attachment records need to go through the internal
    function directly, because of the difference in data format
    (TableData for the realm data tables versus List[Record] for the
    avatar and attachment records).
    """
    re_map_foreign_keys_internal(data[table], table, field_name, related_table, verbose, id_field,
                                 recipient_field)
def re_map_foreign_keys_internal(data_table: List[Record],
                                 table: TableName,
                                 field_name: Field,
                                 related_table: TableName,
                                 verbose: bool=False,
                                 id_field: bool=False,
                                 recipient_field: bool=False) -> None:
    '''
    We occasionally need to assign new ids to rows during the
    import/export process, to accommodate things like existing rows
    already being in tables.  See bulk_import_client for more context.

    The tricky part is making sure that foreign key references
    are in sync with the new ids, and this fixer function does
    the re-mapping.  (It also appends `_id` to the field.)
    '''
    lookup_table = id_maps[related_table]
    for item in data_table:
        if recipient_field:
            if related_table == "stream" and item['type'] == 2:
                pass
            elif related_table == "user_profile" and item['type'] == 1:
                pass
            else:
                continue
        old_id = item[field_name]
        if old_id in lookup_table:
            new_id = lookup_table[old_id]
            if verbose:
                logging.info('Remapping %s %s from %s to %s' % (table,
                                                                field_name + '_id',
                                                                old_id,
                                                                new_id))
        else:
            new_id = old_id
        if not id_field:
            item[field_name + "_id"] = new_id
            del item[field_name]
        else:
            item[field_name] = new_id
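The core behavior of the remapper is: ids found in the lookup table are rewritten to their new values, unknown ids pass through unchanged, and (unless `id_field` is set) the field is renamed with an `_id` suffix. A toy model of that logic, with made-up `id_maps` contents:

```python
# Toy model of re_map_foreign_keys_internal's remapping (the real
# function also handles recipient_field and verbose logging).
id_maps = {'user_profile': {10: 1010}}

def remap(rows, field_name, related_table):
    lookup_table = id_maps[related_table]
    for item in rows:
        old_id = item[field_name]
        new_id = lookup_table.get(old_id, old_id)  # unknown ids pass through
        item[field_name + '_id'] = new_id
        del item[field_name]

rows = [{'sender': 10}, {'sender': 99}]
remap(rows, 'sender', 'user_profile')
print(rows)
# [{'sender_id': 1010}, {'sender_id': 99}]
```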
def fix_bitfield_keys(data: TableData, table: TableName, field_name: Field) -> None:
    for item in data[table]:
        item[field_name] = item[field_name + '_mask']
        del item[field_name + '_mask']

def fix_realm_authentication_bitfield(data: TableData, table: TableName, field_name: Field) -> None:
    """Used to fix up the authentication_methods bitfield to be an integer"""
    for item in data[table]:
        values_as_bitstring = ''.join(['1' if field[1] else '0' for field in
                                       item[field_name]])
        values_as_int = int(values_as_bitstring, 2)
        item[field_name] = values_as_int
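The `authentication_methods` field arrives from the export as a list of (name, enabled) pairs; the loop above packs those booleans into a bitstring and then an integer. A standalone sketch with hypothetical sample values:

```python
# Sketch of the bitfield conversion above; the method names and
# enabled flags here are made up for illustration.
item = {'authentication_methods': [('Email', True), ('GitHub', False), ('LDAP', True)]}
values_as_bitstring = ''.join(['1' if field[1] else '0'
                               for field in item['authentication_methods']])
values_as_int = int(values_as_bitstring, 2)
print(values_as_bitstring, values_as_int)
# 101 5
```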
def update_model_ids(model: Any, data: TableData, table: TableName, related_table: TableName) -> None:
    old_id_list = current_table_ids(data, table)
    allocated_id_list = allocate_ids(model, len(data[table]))
    for item in range(len(data[table])):
        update_id_map(related_table, old_id_list[item], allocated_id_list[item])
    re_map_foreign_keys(data, table, 'id', related_table=related_table, id_field=True)

def bulk_import_model(data: TableData, model: Any, table: TableName,
                      dump_file_id: Optional[str]=None) -> None:
    # TODO: deprecate dump_file_id
    model.objects.bulk_create(model(**item) for item in data[table])
    if dump_file_id is None:
        logging.info("Successfully imported %s from %s." % (model, table))
    else:
        logging.info("Successfully imported %s from %s[%s]." % (model, table, dump_file_id))
# Client is a table shared by multiple realms, so in order to
# correctly import multiple realms into the same server, we need to
# check if a Client object already exists, and we need to support
# remapping all Client IDs to the values in the new DB.
def bulk_import_client(data: TableData, model: Any, table: TableName) -> None:
    for item in data[table]:
        try:
            client = Client.objects.get(name=item['name'])
        except Client.DoesNotExist:
            client = Client.objects.create(name=item['name'])
        update_id_map(table='client', old_id=item['id'], new_id=client.id)
def import_uploads_local(import_dir: Path, processing_avatars: bool=False,
                         processing_emojis: bool=False) -> None:
    records_filename = os.path.join(import_dir, "records.json")
    with open(records_filename) as records_file:
        records = ujson.loads(records_file.read())

    re_map_foreign_keys_internal(records, 'records', 'realm_id', related_table="realm",
                                 id_field=True)
    if not processing_emojis:
        re_map_foreign_keys_internal(records, 'records', 'user_profile_id',
                                     related_table="user_profile", id_field=True)
    for record in records:
        if processing_avatars:
            # For avatars, we need to rehash the user ID with the
            # new server's avatar salt
            avatar_path = user_avatar_path_from_ids(record['user_profile_id'], record['realm_id'])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars", avatar_path)
            if record['s3_path'].endswith('.original'):
                file_path += '.original'
            else:
                file_path += '.png'
        elif processing_emojis:
            # For emojis we follow the function 'upload_emoji_image'
            emoji_path = RealmEmoji.PATH_ID_TEMPLATE.format(
                realm_id=record['realm_id'],
                emoji_file_name=record['file_name'])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars", emoji_path)
        else:
            # Should be kept in sync with its equivalent in zerver/lib/uploads in the
            # function 'upload_message_image'
            s3_file_name = "/".join([
                str(record['realm_id']),
                random_name(18),
                sanitize_name(os.path.basename(record['path']))
            ])
            file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "files", s3_file_name)
            path_maps['attachment_path'][record['path']] = s3_file_name

        orig_file_path = os.path.join(import_dir, record['path'])
        if not os.path.exists(os.path.dirname(file_path)):
            subprocess.check_call(["mkdir", "-p", os.path.dirname(file_path)])
        shutil.copy(orig_file_path, file_path)

    if processing_avatars:
        # Ensure that we have medium-size avatar images for every
        # avatar.  TODO: This implementation is hacky, both in that it
        # does get_user_profile_by_id for each user, and in that it
        # might be better to require the export to just have these.
        upload_backend = LocalUploadBackend()
        for record in records:
            if record['s3_path'].endswith('.original'):
                user_profile = get_user_profile_by_id(record['user_profile_id'])
                avatar_path = user_avatar_path_from_ids(user_profile.id, record['realm_id'])
                medium_file_path = os.path.join(settings.LOCAL_UPLOADS_DIR, "avatars",
                                                avatar_path) + '-medium.png'
                if os.path.exists(medium_file_path):
                    # We remove the image here primarily to deal with
                    # issues when running the import script multiple
                    # times in development (where one might reuse the
                    # same realm ID from a previous iteration).
                    os.remove(medium_file_path)
                upload_backend.ensure_medium_avatar_image(user_profile=user_profile)
def import_uploads_s3(bucket_name: str, import_dir: Path, processing_avatars: bool=False,
                      processing_emojis: bool=False) -> None:
    upload_backend = S3UploadBackend()
    conn = S3Connection(settings.S3_KEY, settings.S3_SECRET_KEY)
    bucket = conn.get_bucket(bucket_name, validate=True)

    records_filename = os.path.join(import_dir, "records.json")
    with open(records_filename) as records_file:
        records = ujson.loads(records_file.read())

    re_map_foreign_keys_internal(records, 'records', 'realm_id', related_table="realm",
                                 id_field=True)
    re_map_foreign_keys_internal(records, 'records', 'user_profile_id',
                                 related_table="user_profile", id_field=True)
    for record in records:
        key = Key(bucket)

        if processing_avatars:
            # For avatars, we need to rehash the user ID with the
            # new server's avatar salt
            avatar_path = user_avatar_path_from_ids(record['user_profile_id'], record['realm_id'])
            key.key = avatar_path
            if record['s3_path'].endswith('.original'):
                key.key += '.original'
        elif processing_emojis:
            # For emojis we follow the function 'upload_emoji_image'
            emoji_path = RealmEmoji.PATH_ID_TEMPLATE.format(
                realm_id=record['realm_id'],
                emoji_file_name=record['file_name'])
            key.key = emoji_path
        else:
            # Should be kept in sync with its equivalent in zerver/lib/uploads in the
            # function 'upload_message_image'
            s3_file_name = "/".join([
                str(record['realm_id']),
                random_name(18),
                sanitize_name(os.path.basename(record['path']))
            ])
            key.key = s3_file_name
            path_maps['attachment_path'][record['path']] = s3_file_name

        user_profile_id = int(record['user_profile_id'])
        # Support email gateway bot and other cross-realm messages
        if user_profile_id in id_maps["user_profile"]:
            logging.info("Uploaded by ID mapped user: %s!" % (user_profile_id,))
            user_profile_id = id_maps["user_profile"][user_profile_id]
        user_profile = get_user_profile_by_id(user_profile_id)
        key.set_metadata("user_profile_id", str(user_profile.id))
        key.set_metadata("realm_id", str(user_profile.realm_id))
        key.set_metadata("orig_last_modified", record['last_modified'])

        headers = {'Content-Type': record['content_type']}

        key.set_contents_from_filename(os.path.join(import_dir, record['path']), headers=headers)

        if processing_avatars:
            # TODO: Ideally, we'd do this in a separate pass, after
            # all the avatars have been uploaded, since we may end up
            # unnecessarily resizing images just before the medium-size
            # image in the export is uploaded.  See the local uploads
            # code path for more notes.
            upload_backend.ensure_medium_avatar_image(user_profile=user_profile)
def import_uploads(import_dir: Path, processing_avatars: bool=False,
                   processing_emojis: bool=False) -> None:
    if processing_avatars:
        logging.info("Importing avatars")
    elif processing_emojis:
        logging.info("Importing emojis")
    else:
        logging.info("Importing uploaded files")
    if settings.LOCAL_UPLOADS_DIR:
        import_uploads_local(import_dir, processing_avatars=processing_avatars,
                             processing_emojis=processing_emojis)
    else:
        if processing_avatars or processing_emojis:
            bucket_name = settings.S3_AVATAR_BUCKET
        else:
            bucket_name = settings.S3_AUTH_UPLOADS_BUCKET
        import_uploads_s3(bucket_name, import_dir, processing_avatars=processing_avatars,
                          processing_emojis=processing_emojis)
# Importing data suffers from a difficult ordering problem because of
# models that reference each other circularly.  Here is a correct order.
#
# * Client [no deps]
# * Realm [-notifications_stream]
# * Stream [only depends on realm]
# * Realm's notifications_stream
# * Now can do all realm_tables
# * UserProfile, in order by ID to avoid bot loop issues
# * Huddle
# * Recipient
# * Subscription
# * Message
# * UserMessage
#
# Because the Python object => JSON conversion process is not fully
# faithful, we have to use a set of fixers (e.g. on DateTime objects
# and Foreign Keys) to do the import correctly.
def do_import_realm(import_dir: Path, subdomain: str) -> Realm:
    logging.info("Importing realm dump %s" % (import_dir,))
    if not os.path.exists(import_dir):
        raise Exception("Missing import directory!")

    realm_data_filename = os.path.join(import_dir, "realm.json")
    if not os.path.exists(realm_data_filename):
        raise Exception("Missing realm.json file!")

    logging.info("Importing realm data from %s" % (realm_data_filename,))
    with open(realm_data_filename) as f:
        data = ujson.load(f)

    update_model_ids(Stream, data, 'zerver_stream', 'stream')
    re_map_foreign_keys(data, 'zerver_realm', 'notifications_stream', related_table="stream")

    fix_datetime_fields(data, 'zerver_realm')
    # Fix realm subdomain information
    data['zerver_realm'][0]['string_id'] = subdomain
    data['zerver_realm'][0]['name'] = subdomain
    fix_realm_authentication_bitfield(data, 'zerver_realm', 'authentication_methods')
    update_model_ids(Realm, data, 'zerver_realm', 'realm')

    realm = Realm(**data['zerver_realm'][0])
    if realm.notifications_stream_id is not None:
        notifications_stream_id = int(realm.notifications_stream_id)  # type: Optional[int]
    else:
        notifications_stream_id = None
    realm.notifications_stream_id = None
    realm.save()
    bulk_import_client(data, Client, 'zerver_client')

    # Email tokens will automatically be randomly generated when the
    # Stream objects are created by Django.
    fix_datetime_fields(data, 'zerver_stream')
    re_map_foreign_keys(data, 'zerver_stream', 'realm', related_table="realm")
    bulk_import_model(data, Stream, 'zerver_stream')

    realm.notifications_stream_id = notifications_stream_id
    realm.save()

    re_map_foreign_keys(data, 'zerver_defaultstream', 'stream', related_table="stream")
    re_map_foreign_keys(data, 'zerver_realmemoji', 'author', related_table="user_profile")
    for (table, model, related_table) in realm_tables:
        re_map_foreign_keys(data, table, 'realm', related_table="realm")
        update_model_ids(model, data, table, related_table)
        bulk_import_model(data, model, table)

    # Remap the user IDs for notification_bot and friends to their
    # appropriate IDs on this server
    for item in data['zerver_userprofile_crossrealm']:
        logging.info("Adding to ID map: %s %s" % (item['id'], get_system_bot(item['email']).id))
        new_user_id = get_system_bot(item['email']).id
        update_id_map(table='user_profile', old_id=item['id'], new_id=new_user_id)

    # Merge in zerver_userprofile_mirrordummy
    data['zerver_userprofile'] = data['zerver_userprofile'] + data['zerver_userprofile_mirrordummy']
    del data['zerver_userprofile_mirrordummy']
    data['zerver_userprofile'].sort(key=lambda r: r['id'])

    # To remap foreign key for UserProfile.last_active_message_id
    update_message_foreign_keys(import_dir)

    fix_datetime_fields(data, 'zerver_userprofile')
    update_model_ids(UserProfile, data, 'zerver_userprofile', 'user_profile')
    re_map_foreign_keys(data, 'zerver_userprofile', 'realm', related_table="realm")
    re_map_foreign_keys(data, 'zerver_userprofile', 'bot_owner', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_userprofile', 'default_sending_stream',
                        related_table="stream")
    re_map_foreign_keys(data, 'zerver_userprofile', 'default_events_register_stream',
                        related_table="stream")
    re_map_foreign_keys(data, 'zerver_userprofile', 'last_active_message_id',
                        related_table="message", id_field=True)
    for user_profile_dict in data['zerver_userprofile']:
        user_profile_dict['password'] = None
        user_profile_dict['api_key'] = random_api_key()
        # Since Zulip doesn't use these permissions, drop them
        del user_profile_dict['user_permissions']
        del user_profile_dict['groups']

    user_profiles = [UserProfile(**item) for item in data['zerver_userprofile']]
    for user_profile in user_profiles:
        user_profile.set_unusable_password()
    UserProfile.objects.bulk_create(user_profiles)

    if 'zerver_huddle' in data:
        bulk_import_model(data, Huddle, 'zerver_huddle')

    re_map_foreign_keys(data, 'zerver_recipient', 'type_id', related_table="stream",
                        recipient_field=True, id_field=True)
    re_map_foreign_keys(data, 'zerver_recipient', 'type_id', related_table="user_profile",
                        recipient_field=True, id_field=True)
    update_model_ids(Recipient, data, 'zerver_recipient', 'recipient')
    bulk_import_model(data, Recipient, 'zerver_recipient')

    re_map_foreign_keys(data, 'zerver_subscription', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_subscription', 'recipient', related_table="recipient")
    update_model_ids(Subscription, data, 'zerver_subscription', 'subscription')
    bulk_import_model(data, Subscription, 'zerver_subscription')

    fix_datetime_fields(data, 'zerver_userpresence')
    re_map_foreign_keys(data, 'zerver_userpresence', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_userpresence', 'client', related_table='client')
    update_model_ids(UserPresence, data, 'zerver_userpresence', 'user_presence')
    bulk_import_model(data, UserPresence, 'zerver_userpresence')

    fix_datetime_fields(data, 'zerver_useractivity')
    re_map_foreign_keys(data, 'zerver_useractivity', 'user_profile', related_table="user_profile")
    re_map_foreign_keys(data, 'zerver_useractivity', 'client', related_table='client')
    update_model_ids(UserActivity, data, 'zerver_useractivity', 'useractivity')
    bulk_import_model(data, UserActivity, 'zerver_useractivity')

    fix_datetime_fields(data, 'zerver_useractivityinterval')
    re_map_foreign_keys(data, 'zerver_useractivityinterval', 'user_profile', related_table="user_profile")
    update_model_ids(UserActivityInterval, data, 'zerver_useractivityinterval',
                     'useractivityinterval')
    bulk_import_model(data, UserActivityInterval, 'zerver_useractivityinterval')

    if 'zerver_customprofilefield' in data:
        # Exporting custom profile fields is not yet supported, so a
        # Zulip export won't contain this table; the Slack importer,
        # however, does generate it.
        re_map_foreign_keys(data, 'zerver_customprofilefield', 'realm', related_table="realm")
        update_model_ids(CustomProfileField, data, 'zerver_customprofilefield',
                         related_table="customprofilefield")
        bulk_import_model(data, CustomProfileField, 'zerver_customprofilefield')

        re_map_foreign_keys(data, 'zerver_customprofilefield_value', 'user_profile',
                            related_table="user_profile")
        re_map_foreign_keys(data, 'zerver_customprofilefield_value', 'field',
                            related_table="customprofilefield")
        update_model_ids(CustomProfileFieldValue, data, 'zerver_customprofilefield_value',
                         related_table="customprofilefield_value")
        bulk_import_model(data, CustomProfileFieldValue, 'zerver_customprofilefield_value')

    # Import uploaded files and avatars
    import_uploads(os.path.join(import_dir, "avatars"), processing_avatars=True)
    import_uploads(os.path.join(import_dir, "uploads"))

    # We need this check because emoji files are only present in data
    # generated by the Slack importer; a Zulip export doesn't include
    # an emoji directory.
    if os.path.exists(os.path.join(import_dir, "emoji")):
        import_uploads(os.path.join(import_dir, "emoji"), processing_emojis=True)

    # Import zerver_message and zerver_usermessage
    import_message_data(import_dir)

    # Do attachments AFTER message data is loaded.
    # TODO: de-dup how we read these json files.
    fn = os.path.join(import_dir, "attachment.json")
    if not os.path.exists(fn):
        raise Exception("Missing attachment.json file!")

    logging.info("Importing attachment data from %s" % (fn,))
    with open(fn) as f:
        data = ujson.load(f)

    import_attachments(data)
    return realm
# create_users and do_import_system_bots differ from their equivalents in
# zerver/management/commands/initialize_voyager_db.py because here we check
# whether each bot already exists and only create users for those that don't.
def do_import_system_bots(realm: Any) -> None:
    internal_bots = [(bot['name'], bot['email_template'] % (settings.INTERNAL_BOT_DOMAIN,))
                     for bot in settings.INTERNAL_BOTS]
    create_users(realm, internal_bots, bot_type=UserProfile.DEFAULT_BOT)
    names = [(settings.FEEDBACK_BOT_NAME, settings.FEEDBACK_BOT)]
    create_users(realm, names, bot_type=UserProfile.DEFAULT_BOT)
    print("Finished importing system bots.")
def create_users(realm: Realm, name_list: Iterable[Tuple[Text, Text]],
                 bot_type: Optional[int]=None) -> None:
    user_set = set()
    for full_name, email in name_list:
        short_name = email_to_username(email)
        if not UserProfile.objects.filter(email=email):
            user_set.add((email, full_name, short_name, True))
    bulk_create_users(realm, user_set, bot_type)
def update_message_foreign_keys(import_dir: Path) -> None:
    dump_file_id = 1
    while True:
        message_filename = os.path.join(import_dir, "messages-%06d.json" % (dump_file_id,))
        if not os.path.exists(message_filename):
            break

        with open(message_filename) as f:
            data = ujson.load(f)

        update_model_ids(Message, data, 'zerver_message', 'message')
        dump_file_id += 1
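Message dumps are numbered with a zero-padded six-digit suffix, so the loop above walks `messages-000001.json`, `messages-000002.json`, ... until a file is missing. The filename pattern in isolation:

```python
# Sketch of the dump-file naming used by update_message_foreign_keys
# and import_message_data.
print("messages-%06d.json" % (1,))
print("messages-%06d.json" % (12,))
# messages-000001.json
# messages-000012.json
```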
def import_message_data(import_dir: Path) -> None:
    dump_file_id = 1
    while True:
        message_filename = os.path.join(import_dir, "messages-%06d.json" % (dump_file_id,))
        if not os.path.exists(message_filename):
            break

        with open(message_filename) as f:
            data = ujson.load(f)

        logging.info("Importing message dump %s" % (message_filename,))
        re_map_foreign_keys(data, 'zerver_message', 'sender', related_table="user_profile")
        re_map_foreign_keys(data, 'zerver_message', 'recipient', related_table="recipient")
        re_map_foreign_keys(data, 'zerver_message', 'sending_client', related_table='client')
        fix_datetime_fields(data, 'zerver_message')
        # Update message content with the remapped attachment URLs
        fix_upload_links(data, 'zerver_message')

        re_map_foreign_keys(data, 'zerver_message', 'id', related_table='message', id_field=True)
        bulk_import_model(data, Message, 'zerver_message')

        # Due to the structure of these message chunks, we're
        # guaranteed to have already imported all the Message objects
        # for this batch of UserMessage objects.
        re_map_foreign_keys(data, 'zerver_usermessage', 'message', related_table="message")
        re_map_foreign_keys(data, 'zerver_usermessage', 'user_profile', related_table="user_profile")
        fix_bitfield_keys(data, 'zerver_usermessage', 'flags')
        update_model_ids(UserMessage, data, 'zerver_usermessage', 'usermessage')
        bulk_import_model(data, UserMessage, 'zerver_usermessage')

        # Exporting reactions is not yet supported, so a Zulip export
        # won't contain this table; the Slack importer, however, does
        # generate it.
        if 'zerver_reaction' in data:
            re_map_foreign_keys(data, 'zerver_reaction', 'message', related_table="message")
            re_map_foreign_keys(data, 'zerver_reaction', 'user_profile', related_table="user_profile")
            for reaction in data['zerver_reaction']:
                if reaction['reaction_type'] == Reaction.REALM_EMOJI:
                    re_map_foreign_keys(data, 'zerver_reaction', 'emoji_code',
                                        related_table="realmemoji", id_field=True)
            update_model_ids(Reaction, data, 'zerver_reaction', 'reaction')
            bulk_import_model(data, Reaction, 'zerver_reaction')

        dump_file_id += 1
def import_attachments(data: TableData) -> None:
|
||||||
|
|
||||||
|
# Clean up the data in zerver_attachment that is not
|
||||||
|
# relevant to our many-to-many import.
|
||||||
|
fix_datetime_fields(data, 'zerver_attachment')
|
||||||
|
re_map_foreign_keys(data, 'zerver_attachment', 'owner', related_table="user_profile")
|
||||||
|
re_map_foreign_keys(data, 'zerver_attachment', 'realm', related_table="realm")
|
||||||
|
|
||||||
|
# Configure ourselves. Django models many-to-many (m2m)
|
||||||
|
# relations asymmetrically. The parent here refers to the
|
||||||
|
# Model that has the ManyToManyField. It is assumed here
|
||||||
|
# the child models have been loaded, but we are in turn
|
||||||
|
# responsible for loading the parents and the m2m rows.
|
||||||
|
parent_model = Attachment
|
||||||
|
parent_db_table_name = 'zerver_attachment'
|
||||||
|
parent_singular = 'attachment'
|
||||||
|
child_singular = 'message'
|
||||||
|
child_plural = 'messages'
|
||||||
|
m2m_table_name = 'zerver_attachment_messages'
|
||||||
|
parent_id = 'attachment_id'
|
||||||
|
child_id = 'message_id'
|
||||||
|
|
||||||
|
update_model_ids(parent_model, data, parent_db_table_name, 'attachment')
|
||||||
|
# First, build our list of many-to-many (m2m) rows.
|
||||||
|
# We do this in a slightly convoluted way to anticipate
|
||||||
|
# a future where we may need to call re_map_foreign_keys.
|
||||||
|
|
||||||
|
m2m_rows = [] # type: List[Record]
|
||||||
|
for parent_row in data[parent_db_table_name]:
|
||||||
|
for fk_id in parent_row[child_plural]:
|
||||||
|
m2m_row = {} # type: Record
|
||||||
|
m2m_row[parent_singular] = parent_row['id']
|
||||||
|
m2m_row[child_singular] = id_maps['message'][fk_id]
|
||||||
|
m2m_rows.append(m2m_row)
|
||||||
|
|
||||||
|
# Create our table data for insert.
|
||||||
|
m2m_data = {m2m_table_name: m2m_rows} # type: TableData
|
||||||
|
convert_to_id_fields(m2m_data, m2m_table_name, parent_singular)
|
||||||
|
convert_to_id_fields(m2m_data, m2m_table_name, child_singular)
|
||||||
|
m2m_rows = m2m_data[m2m_table_name]
|
||||||
|
|
||||||
|
# Next, delete out our child data from the parent rows.
|
||||||
|
for parent_row in data[parent_db_table_name]:
|
||||||
|
del parent_row[child_plural]
|
||||||
|
|
||||||
|
# Update 'path_id' for the attachments
|
||||||
|
for attachment in data[parent_db_table_name]:
|
||||||
|
attachment['path_id'] = path_maps['attachment_path'][attachment['path_id']]
|
||||||
|
|
||||||
|
# Next, load the parent rows.
|
||||||
|
bulk_import_model(data, parent_model, parent_db_table_name)
|
||||||
|
|
||||||
|
# Now, go back to our m2m rows.
|
||||||
|
# TODO: Do this the kosher Django way. We may find a
|
||||||
|
# better way to do this in Django 1.9 particularly.
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
sql_template = '''
|
||||||
|
insert into %s (%s, %s) values(%%s, %%s);''' % (m2m_table_name,
|
||||||
|
parent_id,
|
||||||
|
child_id)
|
||||||
|
tups = [(row[parent_id], row[child_id]) for row in m2m_rows]
|
||||||
|
cursor.executemany(sql_template, tups)
|
||||||
|
|
||||||
|
logging.info('Successfully imported M2M table %s' % (m2m_table_name,))
|
||||||
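The many-to-many load above flattens each parent row's list of child ids into `(parent_id, child_id)` tuples and bulk-inserts them with `executemany`. A minimal, self-contained sketch of that pattern, using the stdlib `sqlite3` in place of the Django connection and tiny stand-in rows:

```python
import sqlite3

# Stand-in parent rows: each attachment carries the ids of its messages,
# mirroring the 'messages' list deleted from the parent rows above.
parent_rows = [
    {'id': 1, 'messages': [10, 11]},
    {'id': 2, 'messages': [11]},
]

conn = sqlite3.connect(':memory:')
cursor = conn.cursor()
cursor.execute('create table zerver_attachment_messages '
               '(attachment_id integer, message_id integer)')

# Flatten parent -> children into m2m tuples, then bulk insert.
tups = [(row['id'], child_id) for row in parent_rows for child_id in row['messages']]
cursor.executemany('insert into zerver_attachment_messages '
                   '(attachment_id, message_id) values (?, ?)', tups)

cursor.execute('select count(*) from zerver_attachment_messages')
print(cursor.fetchone()[0])  # 3
```

Note that sqlite uses `?` placeholders where the Django/psycopg2 cursor above uses `%s`; the executemany shape is otherwise identical.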
@@ -3,6 +3,7 @@
 import sys
 
 from argparse import ArgumentParser
+from django.conf import settings
 from django.core.exceptions import MultipleObjectsReturned
 from django.core.management.base import BaseCommand, CommandError
 from typing import Any, Dict, Optional, Text, List
@@ -16,6 +17,16 @@ def is_integer_string(val: str) -> bool:
     except ValueError:
         return False
 
+def check_config() -> None:
+    for (setting_name, default) in settings.REQUIRED_SETTINGS:
+        try:
+            if settings.__getattr__(setting_name) != default:
+                continue
+        except AttributeError:
+            pass
+
+        raise CommandError("Error: You must set %s in /etc/zulip/settings.py." % (setting_name,))
+
 class ZulipBaseCommand(BaseCommand):
     def add_realm_args(self, parser: ArgumentParser, required: bool=False,
                        help: Optional[str]=None) -> None:
@@ -445,13 +445,24 @@ def get_mobile_push_content(rendered_content: Text) -> Text:
         # Handles realm emojis, avatars etc.
         if elem.tag == "img":
             return elem.get("alt", "")
+        if elem.tag == 'blockquote':
+            return ''  # To avoid empty line before quote text
+        return elem.text or ''
 
-        return elem.text or ""
+    def format_as_quote(quote_text: Text) -> Text:
+        quote_text_list = filter(None, quote_text.split('\n'))  # Remove empty lines
+        quote_text = '\n'.join(map(lambda x: "> "+x, quote_text_list))
+        quote_text += '\n'
+        return quote_text
 
     def process(elem: LH.HtmlElement) -> Text:
         plain_text = get_text(elem)
+        sub_text = ''
         for child in elem:
-            plain_text += process(child)
+            sub_text += process(child)
+        if elem.tag == 'blockquote':
+            sub_text = format_as_quote(sub_text)
+        plain_text += sub_text
         plain_text += elem.tail or ""
         return plain_text
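The `format_as_quote` helper introduced in this hunk is a small pure function: drop empty lines, prefix the rest with `"> "`, and terminate with a newline. Reproduced standalone from the diff:

```python
# Standalone copy of the quote-formatting logic added above.
def format_as_quote(quote_text: str) -> str:
    quote_text_list = filter(None, quote_text.split('\n'))  # Remove empty lines
    quote_text = '\n'.join(map(lambda x: "> " + x, quote_text_list))
    quote_text += '\n'
    return quote_text

print(format_as_quote('first\n\nsecond'))
# > first
# > second
```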
@@ -15,9 +15,10 @@ import random
 from django.conf import settings
 from django.db import connection
 from django.utils.timezone import now as timezone_now
-from typing import Any, Dict, List, Tuple
+from django.forms.models import model_to_dict
+from typing import Any, Dict, List, Optional, Tuple
 from zerver.forms import check_subdomain_available
-from zerver.models import Reaction, RealmEmoji
+from zerver.models import Reaction, RealmEmoji, Realm
 from zerver.lib.slack_message_conversion import convert_to_zulip_markdown, \
     get_user_full_name
 from zerver.lib.parallel import run_parallel
@@ -37,7 +38,7 @@ def rm_tree(path: str) -> None:
     shutil.rmtree(path)
 
 def slack_workspace_to_realm(domain_name: str, realm_id: int, user_list: List[ZerverFieldsT],
-                             realm_subdomain: str, fixtures_path: str, slack_data_dir: str,
+                             realm_subdomain: str, slack_data_dir: str,
                              custom_emoji_list: ZerverFieldsT)-> Tuple[ZerverFieldsT, AddedUsersT,
                                                                        AddedRecipientsT,
                                                                        AddedChannelsT,
@@ -54,7 +55,7 @@ def slack_workspace_to_realm(domain_name: str, realm_id: int, user_list: List[Ze
     """
     NOW = float(timezone_now().timestamp())
 
-    zerver_realm = build_zerver_realm(fixtures_path, realm_id, realm_subdomain, NOW)
+    zerver_realm = build_zerver_realm(realm_id, realm_subdomain, NOW)
 
     realm = dict(zerver_client=[{"name": "populate_db", "id": 1},
                                 {"name": "website", "id": 2},
@@ -99,17 +100,15 @@ def slack_workspace_to_realm(domain_name: str, realm_id: int, user_list: List[Ze
 
     return realm, added_users, added_recipient, added_channels, avatars, emoji_url_map
 
-def build_zerver_realm(fixtures_path: str, realm_id: int, realm_subdomain: str,
+def build_zerver_realm(realm_id: int, realm_subdomain: str,
                        time: float) -> List[ZerverFieldsT]:
-    zerver_realm_skeleton = get_data_file(fixtures_path + 'zerver_realm_skeleton.json')
-    zerver_realm_skeleton[0]['id'] = realm_id
-    zerver_realm_skeleton[0]['string_id'] = realm_subdomain  # subdomain / short_name of realm
-    zerver_realm_skeleton[0]['name'] = realm_subdomain
-    zerver_realm_skeleton[0]['date_created'] = time
-
-    return zerver_realm_skeleton
+    realm = Realm(id=realm_id, date_created=time,
+                  name=realm_subdomain, string_id=realm_subdomain,
+                  description="Organization imported from Slack!")
+    auth_methods = [[flag[0], flag[1]] for flag in realm.authentication_methods]
+    realm_dict = model_to_dict(realm, exclude='authentication_methods')
+    realm_dict['authentication_methods'] = auth_methods
+    return [realm_dict]
 
 def build_realmemoji(custom_emoji_list: ZerverFieldsT,
                      realm_id: int) -> Tuple[List[ZerverFieldsT],
@@ -639,12 +638,25 @@ def channel_message_to_zerver_message(realm_id: int, users: List[ZerverFieldsT],
             # Ignore messages without user names
             # These are Sometimes produced by slack
             continue
+        if message.get('subtype') in [
+                # Zulip doesn't have a pinned_item concept
+                "pinned_item",
+                "unpinned_item",
+                # Slack's channel join/leave notices are spammy
+                "channel_join",
+                "channel_leave",
+                "channel_name"
+        ]:
+            continue
 
         has_attachment = has_image = False
-        content, mentioned_users_id, has_link = convert_to_zulip_markdown(message['text'],
-                                                                          users,
-                                                                          added_channels,
-                                                                          added_users)
+        try:
+            content, mentioned_users_id, has_link = convert_to_zulip_markdown(
+                message['text'], users, added_channels, added_users)
+        except Exception:
+            print("Slack message unexpectedly missing text representation:")
+            print(json.dumps(message, indent=4))
+            continue
         rendered_content = None
 
         recipient_id = added_recipient[message['channel_name']]
@@ -659,14 +671,11 @@ def channel_message_to_zerver_message(realm_id: int, users: List[ZerverFieldsT],
         # Process different subtypes of slack messages
         if 'subtype' in message.keys():
             subtype = message['subtype']
-            if subtype in ["channel_join", "channel_leave", "channel_name"]:
-                continue
-
             # Subtypes which have only the action in the message should
             # be rendered with '/me' in the content initially
             # For example "sh_room_created" has the message 'started a call'
             # which should be displayed as '/me started a call'
-            elif subtype in ["bot_add", "sh_room_created", "me_message"]:
+            if subtype in ["bot_add", "sh_room_created", "me_message"]:
                 content = ('/me %s' % (content))
 
         # For attachments with slack download link
@@ -808,12 +817,12 @@ def build_zerver_attachment(realm_id: int, message_id: int, attachment_id: int,
                       file_name=fileinfo['name'])
     zerver_attachment.append(attachment)
 
-def get_message_sending_user(message: ZerverFieldsT) -> str:
-    try:
-        user = message.get('user', message['file']['user'])
-    except KeyError:
-        user = message.get('user')
-    return user
+def get_message_sending_user(message: ZerverFieldsT) -> Optional[str]:
+    if 'user' in message:
+        return message['user']
+    if message.get('file'):
+        return message['file'].get('user')
+    return None
 
 def build_zerver_usermessage(zerver_usermessage: List[ZerverFieldsT], usermessage_id: int,
                              zerver_subscription: List[ZerverFieldsT], recipient_id: int,
@@ -836,6 +845,7 @@ def build_zerver_usermessage(zerver_usermessage: List[ZerverFieldsT], usermessag
 def do_convert_data(slack_zip_file: str, output_dir: str, token: str, threads: int=6) -> None:
     # Subdomain is set by the user while running the import command
     realm_subdomain = ""
+    realm_id = 0
    domain_name = settings.EXTERNAL_HOST
 
     slack_data_dir = slack_zip_file.replace('.zip', '')
@@ -851,11 +861,6 @@ def do_convert_data(slack_zip_file: str, output_dir: str, token: str, threads: i
     # with zipfile.ZipFile(slack_zip_file, 'r') as zip_ref:
     #     zip_ref.extractall(slack_data_dir)
 
-    script_path = os.path.dirname(os.path.abspath(__file__)) + '/'
-    fixtures_path = script_path + '../fixtures/'
-
-    realm_id = 0
-
     # We get the user data from the legacy token method of slack api, which is depreciated
     # but we use it as the user email data is provided only in this method
     user_list = get_slack_api_data(token, "https://slack.com/api/users.list", "members")
@@ -864,7 +869,7 @@ def do_convert_data(slack_zip_file: str, output_dir: str, token: str, threads: i
 
     realm, added_users, added_recipient, added_channels, avatar_list, \
         emoji_url_map = slack_workspace_to_realm(domain_name, realm_id, user_list,
-                                                 realm_subdomain, fixtures_path,
+                                                 realm_subdomain,
                                                  slack_data_dir, custom_emoji_list)
 
     message_json, uploads_list, zerver_attachment = convert_slack_workspace_messages(
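The rewritten `get_message_sending_user` above replaces a brittle try/except chain with explicit lookups: prefer the message's own `user` key, fall back to the uploading user on `file_share` messages, and return `None` (instead of raising `KeyError` when `file` is `None`, as in the archived-channel uploads handled by this commit series). Reproduced standalone:

```python
from typing import Any, Dict, Optional

# Standalone copy of the rewritten lookup above.
def get_message_sending_user(message: Dict[str, Any]) -> Optional[str]:
    if 'user' in message:
        return message['user']
    if message.get('file'):
        return message['file'].get('user')
    return None

print(get_message_sending_user({'user': 'U061A1R2R'}))                      # U061A1R2R
print(get_message_sending_user({'file': {'user': 'U061A5N1G'}}))            # U061A5N1G
print(get_message_sending_user({'file': None, 'text': 'A file was shared'}))  # None
```

The third call is exactly the `'file': None` fixture added to the importer tests below; the old version would have raised `TypeError` on it.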
@@ -5,16 +5,10 @@ from typing import Any
 from django.conf import settings
 from django.core.management.base import BaseCommand
 
+from zerver.lib.management import check_config
+
 class Command(BaseCommand):
     help = """Checks your Zulip Voyager Django configuration for issues."""
 
     def handle(self, *args: Any, **options: Any) -> None:
-        for (setting_name, default) in settings.REQUIRED_SETTINGS:
-            try:
-                if settings.__getattr__(setting_name) != default:
-                    continue
-            except AttributeError:
-                pass
-
-            print("Error: You must set %s in /etc/zulip/settings.py." % (setting_name,))
-            sys.exit(1)
+        check_config()
@@ -8,7 +8,7 @@ from django.conf import settings
 from django.core.management import call_command
 from django.core.management.base import BaseCommand, CommandParser
 
-from zerver.lib.export import do_import_realm, do_import_system_bots
+from zerver.lib.import_realm import do_import_realm, do_import_system_bots
 from zerver.forms import check_subdomain_available
 from zerver.models import Client, DefaultStream, Huddle, \
     Message, Realm, RealmDomain, RealmFilter, Recipient, \
@@ -65,7 +65,6 @@ import a database dump from one or more JSON files."""
         if subdomain is None:
             print("Enter subdomain!")
             exit(1)
-        check_subdomain_available(subdomain)
 
         if options["destroy_rebuild_database"]:
             print("Rebuilding the database!")
@@ -75,6 +74,8 @@ import a database dump from one or more JSON files."""
         for model in models_to_import:
             self.new_instance_check(model)
 
+        check_subdomain_available(subdomain, from_management_command=True)
+
         for path in options['export_files']:
             if not os.path.exists(path):
                 print("Directory not found: '%s'" % (path,))
zerver/management/commands/register_server.py (new file, 94 lines)
@@ -0,0 +1,94 @@
+from argparse import ArgumentParser
+import json
+import requests
+import subprocess
+from typing import Any
+
+from django.conf import settings
+from django.core.management.base import CommandError
+from django.utils.crypto import get_random_string
+
+from zerver.lib.management import ZulipBaseCommand, check_config
+
+if settings.DEVELOPMENT:
+    SECRETS_FILENAME = "zproject/dev-secrets.conf"
+else:
+    SECRETS_FILENAME = "/etc/zulip/zulip-secrets.conf"
+
+class Command(ZulipBaseCommand):
+    help = """Register a remote Zulip server for push notifications."""
+
+    def add_arguments(self, parser: ArgumentParser) -> None:
+        parser.add_argument('--agree_to_terms_of_service',
+                            dest='agree_to_terms_of_service',
+                            action='store_true',
+                            default=False,
+                            help="Agree to the Zulipchat Terms of Service: https://zulipchat.com/terms/.")
+        parser.add_argument('--rotate-key',
+                            dest="rotate_key",
+                            action='store_true',
+                            default=False,
+                            help="Automatically rotate your server's zulip_org_key")
+
+    def handle(self, **options: Any) -> None:
+        if not settings.DEVELOPMENT:
+            check_config()
+
+        if not options['agree_to_terms_of_service'] and not options["rotate_key"]:
+            raise CommandError(
+                "You must agree to the Zulipchat Terms of Service: https://zulipchat.com/terms/. Run as:\n"
+                "  python manage.py register_remote_server --agree_to_terms_of_service\n")
+
+        if not settings.ZULIP_ORG_ID:
+            raise CommandError("Missing zulip_org_id; run scripts/setup/generate_secrets.py to generate.")
+        if not settings.ZULIP_ORG_KEY:
+            raise CommandError("Missing zulip_org_key; run scripts/setup/generate_secrets.py to generate.")
+        if settings.PUSH_NOTIFICATION_BOUNCER_URL is None:
+            if settings.DEVELOPMENT:
+                settings.PUSH_NOTIFICATION_BOUNCER_URL = (settings.EXTERNAL_URI_SCHEME +
+                                                          settings.EXTERNAL_HOST)
+            else:
+                raise CommandError("Please uncomment PUSH_NOTIFICATION_BOUNCER_URL "
+                                   "in /etc/zulip/settings.py (remove the '#')")
+
+        request = {
+            "zulip_org_id": settings.ZULIP_ORG_ID,
+            "zulip_org_key": settings.ZULIP_ORG_KEY,
+            "hostname": settings.EXTERNAL_HOST,
+            "contact_email": settings.ZULIP_ADMINISTRATOR}
+        if options["rotate_key"]:
+            request["new_org_key"] = get_random_string(64)
+
+        print("The following data will be submitted to the push notification service:")
+        for key in sorted(request.keys()):
+            print("  %s: %s" % (key, request[key]))
+        print("")
+
+        if not options['agree_to_terms_of_service'] and not options["rotate_key"]:
+            raise CommandError(
+                "You must agree to the Terms of Service: https://zulipchat.com/terms/\n"
+                "  python manage.py register_remote_server --agree_to_terms_of_service\n")
+
+        registration_url = settings.PUSH_NOTIFICATION_BOUNCER_URL + "/api/v1/remotes/server/register"
+        try:
+            response = requests.post(registration_url, params=request)
+        except Exception:
+            raise CommandError("Network error connecting to push notifications service (%s)"
+                               % (settings.PUSH_NOTIFICATION_BOUNCER_URL,))
+        try:
+            response.raise_for_status()
+        except Exception:
+            content_dict = json.loads(response.content.decode("utf-8"))
+            raise CommandError("Error: " + content_dict['msg'])
+
+        if response.json()['created']:
+            print("You've successfully registered for the Mobile Push Notification Service!\n"
+                  "To finish setup for sending push notifications:")
+            print("- Restart the server, using /home/zulip/deployments/current/scripts/restart-server")
+            print("- Return to the documentation to learn how to test push notifications")
+        else:
+            if options["rotate_key"]:
+                print("Success! Updating %s with the new key..." % (SECRETS_FILENAME,))
+                subprocess.check_call(["crudini", '--set', SECRETS_FILENAME, "secrets", "zulip_org_key",
+                                       request["new_org_key"]])
+            print("Mobile Push Notification Service registration successfully updated!")
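The new `register_server` command builds a small registration payload, optionally adds a freshly generated `new_org_key` when `--rotate-key` is passed, and prints the payload before submitting it. A sketch of that payload assembly with stand-in values, approximating Django's `get_random_string(64)` with the stdlib `secrets` module (the dict of settings here is purely illustrative):

```python
import secrets

# Illustrative stand-ins for the Django settings the command reads.
settings = {
    'ZULIP_ORG_ID': 'example-org-id',
    'ZULIP_ORG_KEY': 'example-org-key',
    'EXTERNAL_HOST': 'zulip.example.com',
    'ZULIP_ADMINISTRATOR': 'admin@example.com',
}
rotate_key = True  # as if --rotate-key was passed

request = {
    "zulip_org_id": settings['ZULIP_ORG_ID'],
    "zulip_org_key": settings['ZULIP_ORG_KEY'],
    "hostname": settings['EXTERNAL_HOST'],
    "contact_email": settings['ZULIP_ADMINISTRATOR'],
}
if rotate_key:
    # token_hex(32) yields 64 hex characters, matching get_random_string(64)'s length.
    request["new_org_key"] = secrets.token_hex(32)

print("The following data will be submitted to the push notification service:")
for key in sorted(request.keys()):
    print("  %s: %s" % (key, request[key]))
```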
@@ -101,6 +101,30 @@ class TestStreamEmailMessagesSuccess(ZulipTestCase):
         self.assertEqual(get_display_recipient(message.recipient), stream.name)
         self.assertEqual(message.topic_name(), incoming_valid_message['Subject'])
 
+    def test_receive_stream_email_messages_blank_subject_success(self) -> None:
+        user_profile = self.example_user('hamlet')
+        self.login(user_profile.email)
+        self.subscribe(user_profile, "Denmark")
+        stream = get_stream("Denmark", user_profile.realm)
+
+        stream_to_address = encode_email_address(stream)
+
+        incoming_valid_message = MIMEText('TestStreamEmailMessages Body')  # type: Any # https://github.com/python/typeshed/issues/275
+
+        incoming_valid_message['Subject'] = ''
+        incoming_valid_message['From'] = self.example_email('hamlet')
+        incoming_valid_message['To'] = stream_to_address
+        incoming_valid_message['Reply-to'] = self.example_email('othello')
+
+        process_message(incoming_valid_message)
+
+        # Hamlet is subscribed to this stream so should see the email message from Othello.
+        message = most_recent_message(user_profile)
+
+        self.assertEqual(message.content, "TestStreamEmailMessages Body")
+        self.assertEqual(get_display_recipient(message.recipient), stream.name)
+        self.assertEqual(message.topic_name(), "(no topic)")
+
 class TestStreamEmailMessagesEmptyBody(ZulipTestCase):
     def test_receive_stream_email_messages_empty_body(self) -> None:
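The new test above asserts the behavior the email mirror is expected to show for an empty Subject header: the topic falls back to `"(no topic)"`. A minimal sketch of that scenario outside the test suite, building the same kind of `MIMEText` message; the fallback expression here is an illustrative stand-in for whatever `process_message` does internally:

```python
from email.mime.text import MIMEText

# Build an incoming email with a blank Subject, as in the new test case.
message = MIMEText('TestStreamEmailMessages Body')
message['Subject'] = ''
message['From'] = 'hamlet@example.com'

# Stand-in for the mirror's topic selection: blank subjects get "(no topic)".
subject = message['Subject']
topic_name = subject if subject else "(no topic)"
print(topic_name)  # (no topic)
```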
@@ -11,7 +11,7 @@ from django.conf import settings
 from django.core.management import call_command
 from django.test import TestCase, override_settings
 from zerver.lib.actions import do_create_user
-from zerver.lib.management import ZulipBaseCommand, CommandError
+from zerver.lib.management import ZulipBaseCommand, CommandError, check_config
 from zerver.lib.test_classes import ZulipTestCase
 from zerver.lib.test_helpers import stdout_suppressed
 from zerver.lib.test_runner import slow
@@ -20,6 +20,12 @@ from zerver.models import get_user_profile_by_email
 from zerver.models import get_realm, UserProfile, Realm
 from confirmation.models import RealmCreationKey, generate_realm_creation_url
 
+class TestCheckConfig(ZulipTestCase):
+    def test_check_config(self) -> None:
+        with self.assertRaisesRegex(CommandError, "Error: You must set ZULIP_ADMINISTRATOR in /etc/zulip/settings.py."):
+            check_config()
+
+
 class TestZulipBaseCommand(ZulipTestCase):
     def setUp(self) -> None:
         self.zulip_realm = get_realm("zulip")
@@ -6,6 +6,7 @@ from django.contrib.sites.models import Site
 from django.http import HttpResponse
 from django.test import TestCase, override_settings
 from django.utils.timezone import now as timezone_now
+from django.core.exceptions import ValidationError
 
 from mock import patch, MagicMock
 from zerver.lib.test_helpers import MockLDAP
@@ -14,7 +15,7 @@ from confirmation.models import Confirmation, create_confirmation_link, Multiuse
     generate_key, confirmation_url, get_object_from_key, ConfirmationKeyException
 from confirmation import settings as confirmation_settings
 
-from zerver.forms import HomepageForm, WRONG_SUBDOMAIN_ERROR
+from zerver.forms import HomepageForm, WRONG_SUBDOMAIN_ERROR, check_subdomain_available
 from zerver.lib.actions import do_change_password
 from zerver.views.auth import login_or_register_remote_user, \
     redirect_and_log_into_subdomain
@@ -1521,6 +1522,15 @@ class RealmCreationTest(ZulipTestCase):
         self.assert_in_success_response(["available"], result)
         self.assert_not_in_success_response(["unavailable"], result)
 
+    def test_subdomain_check_management_command(self) -> None:
+        # Short names should work
+        check_subdomain_available('aa', from_management_command=True)
+        # So should reserved ones
+        check_subdomain_available('zulip', from_management_command=True)
+        # malformed names should still not
+        with self.assertRaises(ValidationError):
+            check_subdomain_available('-ba_d-', from_management_command=True)
+
 class UserSignUpTest(ZulipTestCase):
 
     def _assert_redirected_to(self, result: HttpResponse, url: Text) -> None:
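The new test documents the semantics of `from_management_command=True`: short and reserved subdomains are allowed, but malformed ones still fail. The sketch below is illustrative only; the regex and `ValueError` stand in for Zulip's real validator and Django's `ValidationError`, which this does not reproduce:

```python
import re

# Hypothetical validator capturing the semantics the test exercises.
def check_subdomain_available(subdomain: str, from_management_command: bool = False) -> None:
    # Malformed labels (bad leading/trailing chars, underscores) always fail.
    if not re.match(r'^[a-z0-9]([a-z0-9-]*[a-z0-9])?$', subdomain):
        raise ValueError("Subdomain is malformed: %s" % (subdomain,))
    # Length/reserved-name policy is skipped for management-command imports.
    if not from_management_command and len(subdomain) < 3:
        raise ValueError("Subdomain too short: %s" % (subdomain,))

check_subdomain_available('aa', from_management_command=True)  # short names pass
try:
    check_subdomain_available('-ba_d-', from_management_command=True)
except ValueError as e:
    print(e)  # Subdomain is malformed: -ba_d-
```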
@@ -24,7 +24,7 @@ from zerver.lib.slack_data_to_zulip_data import (
     do_convert_data,
     process_avatars,
 )
-from zerver.lib.export import (
+from zerver.lib.import_realm import (
     do_import_realm,
 )
 from zerver.lib.avatar_hash import (
@@ -92,11 +92,10 @@ class SlackImporter(ZulipTestCase):
         self.assertEqual(invalid.exception.args, ('Something went wrong. Please try again!',),)
 
     def test_build_zerver_realm(self) -> None:
-        fixtures_path = os.path.dirname(os.path.abspath(__file__)) + '/../fixtures/'
         realm_id = 2
         realm_subdomain = "test-realm"
         time = float(timezone_now().timestamp())
-        test_realm = build_zerver_realm(fixtures_path, realm_id, realm_subdomain, time)
+        test_realm = build_zerver_realm(realm_id, realm_subdomain, time)
         test_zerver_realm_dict = test_realm[0]
 
         self.assertEqual(test_zerver_realm_dict['id'], realm_id)
@@ -323,7 +322,7 @@ class SlackImporter(ZulipTestCase):
         realm_id = 1
         user_list = []  # type: List[Dict[str, Any]]
         realm, added_users, added_recipient, added_channels, avatar_list, em = slack_workspace_to_realm(
-            'testdomain', realm_id, user_list, 'test-realm', './fixtures', './random_path', {})
+            'testdomain', realm_id, user_list, 'test-realm', './random_path', {})
         test_zerver_realmdomain = [{'realm': realm_id, 'allow_subdomains': False,
                                     'domain': 'testdomain', 'id': realm_id}]
         # Functioning already tests in helper functions
@@ -398,6 +397,11 @@ class SlackImporter(ZulipTestCase):
                         "ts": "1463868370.000008", "channel_name": "general"},
                        {"text": "test message 2", "user": "U061A5N1G",
                         "ts": "1433868549.000010", "channel_name": "general"},
+                       # This message will be ignored since it has no user and file is None.
+                       # See #9217 for the situation; likely file uploads on archived channels
+                       {'upload': False, 'file': None, 'text': 'A file was shared',
+                        'channel_name': 'general', 'type': 'message', 'ts': '1433868549.000011',
+                        'subtype': 'file_share'},
+                       {"text": "random test", "user": "U061A1R2R",
                         "ts": "1433868669.000012", "channel_name": "general"}]  # type: List[Dict[str, Any]]
@@ -16,3 +16,11 @@ Receive GitLab notifications in Zulip!
|
|||||||
{!congrats.md!}
|
{!congrats.md!}
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
|
!!! tip ""
|
||||||
|
If your GitLab server and your Zulip server are on a local network
|
||||||
|
together, and you're running GitLab 10.5 or newer, you may need to enable
|
||||||
|
GitLab's "Allow requests to the local network from hooks and
|
||||||
|
services" setting (by default, recent GitLab versions refuse to post
|
||||||
|
webhook events to servers on the local network). You can find this
|
||||||
|
setting near the bottom of the GitLab "Settings" page in the "Admin area".
|
||||||
|
|||||||
@@ -133,6 +133,8 @@ DEFAULT_SETTINGS = {
|
|||||||
# LDAP auth
|
# LDAP auth
|
||||||
'AUTH_LDAP_SERVER_URI': "",
|
'AUTH_LDAP_SERVER_URI': "",
|
||||||
'LDAP_EMAIL_ATTR': None,
|
'LDAP_EMAIL_ATTR': None,
|
||||||
|
# Disable django-auth-ldap caching, to prevent problems with OU changes.
|
||||||
|
'AUTH_LDAP_GROUP_CACHE_TIMEOUT': 0,
|
||||||
|
|
||||||
# Social auth
|
# Social auth
|
||||||
'SOCIAL_AUTH_GITHUB_KEY': None,
|
'SOCIAL_AUTH_GITHUB_KEY': None,
|
||||||
|
|||||||
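The last hunk above extends a `DEFAULT_SETTINGS` dict: each entry is a fallback that applies only when the deployment's own settings module has not defined that name. Below is a minimal, hypothetical sketch of that merge pattern (the helper `apply_defaults` and the `prod` example are illustrative, not Zulip's actual code):

```python
# Fallback values, mirroring the entries shown in the diff hunk.
DEFAULT_SETTINGS = {
    'AUTH_LDAP_SERVER_URI': "",
    'LDAP_EMAIL_ATTR': None,
    # Disable django-auth-ldap caching, to prevent problems with OU changes.
    'AUTH_LDAP_GROUP_CACHE_TIMEOUT': 0,
}

def apply_defaults(deployment_settings, defaults):
    """Return deployment settings with any missing names filled from defaults.

    Explicit deployment values always win; setdefault() only fills gaps.
    """
    merged = dict(deployment_settings)
    for name, value in defaults.items():
        merged.setdefault(name, value)
    return merged

# A deployment that overrides only the LDAP server URI still picks up the
# new AUTH_LDAP_GROUP_CACHE_TIMEOUT default from the hunk above.
prod = apply_defaults(
    {'AUTH_LDAP_SERVER_URI': 'ldaps://ldap.example.com'},
    DEFAULT_SETTINGS,
)
```

This is why adding new keys to `DEFAULT_SETTINGS` is backwards-compatible: existing installations gain the new default without editing their settings file.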